Dec 05 20:05:33 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 20:05:33 crc restorecon[4704]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:05:33 crc restorecon[4704]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:33 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:05:34 crc restorecon[4704]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:05:34 crc 
restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc 
restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:05:34 crc restorecon[4704]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
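The four Flag deprecation warnings above all point at the same remedy: carry the values in the KubeletConfiguration file named by --config instead of on the kubelet command line. A minimal sketch of the equivalent config, assuming upstream kubelet.config.k8s.io/v1beta1 field names; the socket path, plugin directory, and taint shown are illustrative placeholders, not values recovered from this log:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint (assumed CRI-O socket path)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir (hypothetical path)
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --register-with-taints (illustrative taint, not from this log)
    registerWithTaints:
      - key: node-role.kubernetes.io/master
        effect: NoSchedule
    # the warning suggests eviction thresholds as the successor to
    # --minimum-container-ttl-duration; this is the common memory threshold
    evictionHard:
      memory.available: "100Mi"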
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 05 20:05:34 crc kubenswrapper[4885]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.974166 4885 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980576 4885 feature_gate.go:330] unrecognized feature gate: Example
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980621 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980636 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980651 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980663 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980674 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980684 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980696 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980706 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980720 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
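The W1205 lines here and below are the kubelet's feature-gate registry rejecting names it does not know; the names shown (ConsolePluginContentSecurityPolicy, ManagedBootImages, and so on) appear to be OpenShift-level gates handed down to the node, and the kubelet logs each one at warning level and continues rather than failing startup. The CloudDualStackNodeIPs entry is the opposite case: a gate that has already graduated to GA, so setting it explicitly is redundant and will stop being accepted once the gate is removed. Feature gates and the --system-reserved values travel in the same KubeletConfiguration file; a minimal sketch, again with upstream field names and placeholder reservation sizes not taken from this log:

    # map of gate name to bool; unknown names draw the warnings above,
    # GA names draw the "will be removed" notice
    featureGates:
      CloudDualStackNodeIPs: true
    # replaces the deprecated --system-reserved flag (illustrative values)
    systemReserved:
      cpu: "500m"
      memory: "1Gi"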
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980733 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980744 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980757 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980769 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980781 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980791 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980802 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980812 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980823 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980833 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980843 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980853 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980863 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980874 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980886 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980897 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980908 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980919 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980929 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980941 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980951 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980963 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980973 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980984 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.980994 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981005 4885 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981015 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981059 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981069 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981080 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981090 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981100 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981110 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981121 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981131 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981142 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981152 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981164 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981174 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981184 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981195 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981207 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981225 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981240 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
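The long run of feature_gate.go:330 warnings is expected on OpenShift: names like ManagedBootImages, GatewayAPI, or NewOLM are cluster-level OpenShift feature gates that get rendered into the kubelet's config wholesale, and the upstream kubelet only registers Kubernetes gates, so it warns and ignores the rest; the feature_gate.go:386 map printed later shows what survives. To see the full list the node hands to the kubelet (path taken from the --config flag below; the exact file layout is an assumption, so the grep may need adjusting):

  # List the featureGates block in the rendered kubelet config.
  grep -A60 'featureGates' /etc/kubernetes/kubelet.conf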
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981255 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981270 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981282 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981293 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981304 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981315 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981329 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981340 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981351 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981363 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981375 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981386 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981397 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981411 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981421 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981435 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.981446 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981641 4885 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981671 4885 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981700 4885 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981718 4885 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981734 4885 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981747 4885 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981763 4885 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981780 4885 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981793 4885 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 20:05:34 crc 
kubenswrapper[4885]: I1205 20:05:34.981805 4885 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981819 4885 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981833 4885 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981846 4885 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981859 4885 flags.go:64] FLAG: --cgroup-root="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981872 4885 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981886 4885 flags.go:64] FLAG: --client-ca-file="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981899 4885 flags.go:64] FLAG: --cloud-config="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981911 4885 flags.go:64] FLAG: --cloud-provider="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981923 4885 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981937 4885 flags.go:64] FLAG: --cluster-domain="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981950 4885 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981963 4885 flags.go:64] FLAG: --config-dir="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981974 4885 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.981987 4885 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982002 4885 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982014 4885 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982064 4885 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982077 4885 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982089 4885 flags.go:64] FLAG: --contention-profiling="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982101 4885 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982114 4885 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982128 4885 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982143 4885 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982158 4885 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982172 4885 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982184 4885 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982197 4885 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982210 4885 flags.go:64] FLAG: --enable-server="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982222 4885 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 
20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982238 4885 flags.go:64] FLAG: --event-burst="100" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982251 4885 flags.go:64] FLAG: --event-qps="50" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982264 4885 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982276 4885 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982288 4885 flags.go:64] FLAG: --eviction-hard="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982303 4885 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982315 4885 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982328 4885 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982342 4885 flags.go:64] FLAG: --eviction-soft="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982354 4885 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982367 4885 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982379 4885 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982391 4885 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982405 4885 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982417 4885 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982429 4885 flags.go:64] FLAG: --feature-gates="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982443 4885 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982456 4885 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982470 4885 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982484 4885 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982496 4885 flags.go:64] FLAG: --healthz-port="10248" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982509 4885 flags.go:64] FLAG: --help="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982521 4885 flags.go:64] FLAG: --hostname-override="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982535 4885 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982548 4885 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982560 4885 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982572 4885 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982585 4885 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982597 4885 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982612 4885 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 
20:05:34.982624 4885 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982638 4885 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982651 4885 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982665 4885 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982678 4885 flags.go:64] FLAG: --kube-reserved="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982690 4885 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982702 4885 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982714 4885 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982726 4885 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982739 4885 flags.go:64] FLAG: --lock-file="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982754 4885 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982766 4885 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982806 4885 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982826 4885 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982839 4885 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982852 4885 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982864 4885 flags.go:64] FLAG: --logging-format="text" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982877 4885 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982891 4885 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982905 4885 flags.go:64] FLAG: --manifest-url="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982917 4885 flags.go:64] FLAG: --manifest-url-header="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982934 4885 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982947 4885 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982962 4885 flags.go:64] FLAG: --max-pods="110" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982974 4885 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.982988 4885 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983000 4885 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983013 4885 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983062 4885 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983076 4885 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 
20:05:34.983089 4885 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983119 4885 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983134 4885 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983147 4885 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983160 4885 flags.go:64] FLAG: --pod-cidr="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983240 4885 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983310 4885 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983324 4885 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983336 4885 flags.go:64] FLAG: --pods-per-core="0" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983347 4885 flags.go:64] FLAG: --port="10250" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983359 4885 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983370 4885 flags.go:64] FLAG: --provider-id="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983380 4885 flags.go:64] FLAG: --qos-reserved="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983391 4885 flags.go:64] FLAG: --read-only-port="10255" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983400 4885 flags.go:64] FLAG: --register-node="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983410 4885 flags.go:64] FLAG: --register-schedulable="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983419 4885 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983444 4885 flags.go:64] FLAG: --registry-burst="10" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983454 4885 flags.go:64] FLAG: --registry-qps="5" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983463 4885 flags.go:64] FLAG: --reserved-cpus="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983472 4885 flags.go:64] FLAG: --reserved-memory="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983484 4885 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983494 4885 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983503 4885 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983513 4885 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983522 4885 flags.go:64] FLAG: --runonce="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983532 4885 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983542 4885 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983553 4885 flags.go:64] FLAG: --seccomp-default="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983563 4885 flags.go:64] 
FLAG: --serialize-image-pulls="true" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983572 4885 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983584 4885 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983593 4885 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983603 4885 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983612 4885 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983623 4885 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983634 4885 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983645 4885 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983660 4885 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983674 4885 flags.go:64] FLAG: --system-cgroups="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983686 4885 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983711 4885 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983722 4885 flags.go:64] FLAG: --tls-cert-file="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983731 4885 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983747 4885 flags.go:64] FLAG: --tls-min-version="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983756 4885 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983766 4885 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983775 4885 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983785 4885 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983794 4885 flags.go:64] FLAG: --v="2" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983808 4885 flags.go:64] FLAG: --version="false" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983820 4885 flags.go:64] FLAG: --vmodule="" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983832 4885 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.983842 4885 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984198 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984210 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984219 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984228 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984236 4885 feature_gate.go:330] unrecognized 
feature gate: NetworkSegmentation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984246 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984254 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984262 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984270 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984277 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984285 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984293 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984300 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984308 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984316 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984324 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984338 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984349 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984357 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984368 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984379 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984388 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984396 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984408 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984416 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984424 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984431 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984439 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984447 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984458 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
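A caveat on the flags.go:64 dump a few lines up: it records flag values as parsed, defaults included, before the file named by --config is merged in, and only flags explicitly set on the command line override the file. The merged configuration the kubelet is actually running with is served on its authenticated configz endpoint, reachable through the API server; a sketch, assuming an admin kubeconfig and the node name crc seen in these logs:

  # Fetch the kubelet's effective (merged) configuration via the API server.
  kubectl get --raw "/api/v1/nodes/crc/proxy/configz" | python3 -m json.tool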
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984468 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984477 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984485 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984493 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984501 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984509 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984517 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984525 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984532 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984541 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984548 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984556 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984563 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984571 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984579 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984588 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984595 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984603 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984614 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984624 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984632 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984641 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984651 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984661 4885 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984674 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984687 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984697 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984706 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984717 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984733 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984745 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984755 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984766 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984777 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984788 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984796 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984804 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984812 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984819 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984827 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.984835 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.984866 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.996710 4885 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.996753 4885 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.996943 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.996963 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.996979 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.996991 4885 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997001 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997010 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997050 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997060 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997069 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997078 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997086 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997097 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997111 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997120 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997130 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997138 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997146 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997156 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997164 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997173 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997181 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997192 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997200 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997209 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997218 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997226 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997235 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997243 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997251 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:05:34 
crc kubenswrapper[4885]: W1205 20:05:34.997260 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997268 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997277 4885 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997285 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997293 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997301 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997311 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997320 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997329 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997338 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997346 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997355 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997366 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
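The server.go:491 line further up pins this node at kubeletVersion v1.31.5, and the empty GOGC/GOMAXPROCS/GOTRACEBACK values on the following line just mean the Go runtime defaults are in effect. The version is the easiest of these to cross-check from outside the node; a sketch using the node name from these logs:

  # Cross-check the logged kubelet version against the Node object.
  oc get node crc -o jsonpath='{.status.nodeInfo.kubeletVersion}{"\n"}'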
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997378 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997388 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997398 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997407 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997417 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997426 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997435 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997444 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997453 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997462 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997470 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997482 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997493 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997504 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997514 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997525 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
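Because the gate set is re-parsed at several startup stages, each unrecognized name is warned about three or four times in quick succession, which makes the journal noisy. A quick way to collapse the noise to one line per gate (assuming the unit logs as kubelet.service on this host; adjust the unit name if the wrapper logs elsewhere):

  # One line per unique unrecognized gate, with an occurrence count.
  journalctl -b -u kubelet --no-pager \
    | grep -oE 'unrecognized feature gate: [A-Za-z0-9]+' \
    | sort | uniq -c | sort -rn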
Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997536 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997545 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997555 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997564 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997573 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997581 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997589 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997598 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997606 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997647 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997658 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997669 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:05:34 crc kubenswrapper[4885]: W1205 20:05:34.997678 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:05:34 crc kubenswrapper[4885]: I1205 20:05:34.997692 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000628 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000675 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000689 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000702 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000714 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000725 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000735 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000744 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000752 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000761 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000771 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000779 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000788 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000797 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000806 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000817 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000829 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000839 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000848 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000857 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000866 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000875 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000885 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000894 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000903 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000913 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000922 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000932 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000943 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
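The feature_gate.go:351/353 variants are a different, milder warning: CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, and ValidatingAdmissionPolicy have gone GA and KMSv1 is deprecated, yet the config still pins them explicitly, and those pins may start to fail once the gates are removed upstream. Listing just those pins (same journalctl caveats as above):

  # Show only the GA/deprecated pinned-gate warnings, deduplicated.
  journalctl -b -u kubelet --no-pager | grep -E 'feature_gate.go:(351|353)' | sort -u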
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000968 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000978 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000988 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.000996 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001005 4885 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001014 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001089 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001098 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001107 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001116 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001125 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001135 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001143 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001152 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001160 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001169 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001178 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001188 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001196 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001205 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001213 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001222 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001230 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001238 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001247 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 
20:05:35.001255 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001263 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001272 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001280 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001288 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001297 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001305 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001314 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001322 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001331 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001340 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001348 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001357 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001366 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001374 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001382 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.001390 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.001406 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.002007 4885 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.006596 4885 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.006742 4885 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
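Note that the FLAG dump earlier shows --rotate-certificates="false", yet server.go:940 just reported client rotation as on: the dump shows the flag's default, the flag was never set on the command line, and the config file evidently sets rotateCertificates, so the file's value applies. The certificate the kubelet loaded from the certificate store can be inspected directly on the node; its expiry should match the 2026-02-24 date logged immediately after:

  # Inspect the kubelet's current client certificate.
  openssl x509 -in /var/lib/kubelet/pki/kubelet-client-current.pem -noout -subject -dates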
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.008254 4885 server.go:997] "Starting client certificate rotation"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.008302 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.008793 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 10:25:46.655036552 +0000 UTC
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.008888 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 494h20m11.646153197s for next certificate rotation
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.020409 4885 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.023206 4885 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.033585 4885 log.go:25] "Validated CRI v1 runtime API"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.066282 4885 log.go:25] "Validated CRI v1 image API"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.068869 4885 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.072401 4885 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-20-00-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.072436 4885 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
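The rotation arithmetic in the certificate_manager.go:356 lines is simple: the client certificate expires 2026-02-24 05:52:08 UTC, the manager picks a jittered deadline inside the tail of the validity window (upstream jitters to roughly 70-90% of the NotBefore-to-NotAfter span; that fraction is an assumption about the implementation, not something this log states), and then sleeps the difference, which is where "Waiting 494h20m11.646153197s" comes from. A sketch of that computation:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a random point late in the certificate's
// validity, mirroring the jittered deadline certificate_manager logs
// before sleeping. The 0.7-0.9 window is an assumption based on
// upstream kubelet behavior, not read from this log.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jitter := time.Duration((0.7 + 0.2*rand.Float64()) * float64(total))
	return notBefore.Add(jitter)
}

func main() {
	notBefore := time.Date(2025, 2, 24, 5, 52, 8, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)  // expiry from the log
	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Printf("rotation deadline %s, waiting %s\n", deadline, time.Until(deadline))
}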
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.089942 4885 manager.go:217] Machine: {Timestamp:2025-12-05 20:05:35.088224478 +0000 UTC m=+0.385040169 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5edae59e-e3c2-4636-b1d3-4225cdddd2db BootID:947d01a1-b35b-4747-9479-3c70c2147f66 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d3:2d:e4 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d3:2d:e4 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1f:c9:90 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:31:fd:fc Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6d:a7:38 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:30:1b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ae:c4:d0:70:15:36 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4a:fd:31:72:19:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
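Each Filesystems entry in the Machine record above, like the fs.go:134 partition scan before it, reduces to a statfs(2) call per mountpoint: capacity is blocks times block size, and the inode count comes from the same struct. A minimal Linux-only sketch for /var, the xfs partition backing /var/lib/kubelet on this node:

package main

import (
	"fmt"
	"syscall"
)

// Reports the same Capacity/Inodes pair cAdvisor records for a
// mountpoint (e.g. /dev/vda4 on /var above). Linux-only: Statfs_t
// field types are platform-specific.
func main() {
	var st syscall.Statfs_t
	if err := syscall.Statfs("/var", &st); err != nil {
		panic(err)
	}
	capacity := st.Blocks * uint64(st.Bsize) // total bytes on the filesystem
	fmt.Printf("capacity=%d inodes=%d hasInodes=%v\n", capacity, st.Files, st.Files > 0)
}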
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.090266 4885 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.090491 4885 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.091136 4885 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.091342 4885 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.091385 4885 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.091730 4885 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.091750 4885 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.092448 4885 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.092608 4885 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
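The nodeConfig dump above carries the eviction policy this kubelet will enforce: memory.available below 100Mi, nodefs.available below 10%, nodefs.inodesFree below 5%, imagefs.available below 15%, imagefs.inodesFree below 5%, all hard thresholds with no grace period. Each threshold is either an absolute quantity or a percentage of capacity; a small sketch of how such a signal check works (types and names here are illustrative, not kubelet internals):

package main

import "fmt"

// threshold mirrors a HardEvictionThresholds entry from the nodeConfig
// above: either an absolute quantity (memory.available < 100Mi) or a
// percentage of capacity (nodefs.available < 10%).
type threshold struct {
	signal   string
	quantity uint64  // bytes; zero when percent is used
	percent  float64 // fraction of capacity; zero when quantity is used
}

// crossed reports whether the signal has fallen under the threshold.
func (t threshold) crossed(available, capacity uint64) bool {
	if t.quantity > 0 {
		return available < t.quantity
	}
	return float64(available) < t.percent*float64(capacity)
}

func main() {
	mem := threshold{signal: "memory.available", quantity: 100 << 20}
	nodefs := threshold{signal: "nodefs.available", percent: 0.10}
	fmt.Println(mem.crossed(80<<20, 32<<30))    // true: under 100Mi free
	fmt.Println(nodefs.crossed(20<<30, 85<<30)) // false: well above 10%
}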
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.093091 4885 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.094113 4885 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.095545 4885 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.095611 4885 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.095659 4885 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.095685 4885 kubelet.go:324] "Adding apiserver pod source"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.095706 4885 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.098167 4885 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.098835 4885 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.100344 4885 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101261 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101309 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101326 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101342 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101367 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101382 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101398 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101423 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101439 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101456 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101476 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.101491 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.102174 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
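The run of plugins.go:603 lines is the in-tree volume plugin registry being populated: each plugin registers under a unique kubernetes.io/... name, and pod volumes are later matched to a plugin by that name. A minimal sketch of the registry pattern (the VolumePlugin interface here is a stand-in, not the kubelet's real one):

package main

import "fmt"

// VolumePlugin stands in for the kubelet's volume plugin interface;
// only the name lookup matters for this sketch.
type VolumePlugin interface {
	Name() string
}

type emptyDir struct{}

func (emptyDir) Name() string { return "kubernetes.io/empty-dir" }

// registry keys plugins by their unique name, as the log lines above show.
type registry map[string]VolumePlugin

func (r registry) register(p VolumePlugin) {
	r[p.Name()] = p
	fmt.Printf("Loaded volume plugin %q\n", p.Name())
}

func main() {
	r := registry{}
	r.register(emptyDir{})
	if _, ok := r["kubernetes.io/empty-dir"]; ok {
		fmt.Println("plugin available for pod volumes")
	}
}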
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.102259 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.102420 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.102426 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.103122 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.103912 4885 server.go:1280] "Started kubelet"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.104473 4885 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.104472 4885 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.105151 4885 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 20:05:35 crc systemd[1]: Started Kubernetes Kubelet.
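Both reflector failures above, and the connection-refused errors that follow, share one cause: api-int.crc.testing (38.102.83.164) is this node's own kube-apiserver, which runs as a static pod the kubelet has not started yet, so nothing is listening on 6443. client-go keeps retrying with backoff until the listener appears. A dependency-free sketch of that retry shape, using a raw TCP probe as a stand-in for the LIST call:

package main

import (
	"fmt"
	"net"
	"time"
)

// dialWithBackoff retries a TCP probe the way the reflectors above keep
// retrying their LIST calls: fail fast on "connection refused", back off,
// and try again until the endpoint comes up.
func dialWithBackoff(addr string, max time.Duration) {
	backoff := 200 * time.Millisecond // illustrative starting interval
	for {
		conn, err := net.DialTimeout("tcp", addr, time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("endpoint is up:", addr)
			return
		}
		fmt.Printf("dial %s: %v; retrying in %s\n", addr, err, backoff)
		time.Sleep(backoff)
		if backoff *= 2; backoff > max {
			backoff = max // cap the exponential growth
		}
	}
}

func main() {
	dialWithBackoff("api-int.crc.testing:6443", 30*time.Second)
}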
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.106218 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.107776 4885 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.108776 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.108878 4885 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.108898 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:36:56.174196049 +0000 UTC
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.109189 4885 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.109204 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.109245 4885 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.109228 4885 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.107756 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e6a706ecfda88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:05:35.103810184 +0000 UTC m=+0.400625885,LastTimestamp:2025-12-05 20:05:35.103810184 +0000 UTC m=+0.400625885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.109858 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms"
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.110130 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.110280 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
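The controller.go:145 line above is the node-lease controller failing its ensure-exists step and promising to retry on a 200ms interval. The step itself is a plain get-then-create against coordination.k8s.io; a sketch using standard client-go (everything here is an assumption-laden illustration of that loop, not the kubelet's nodelease controller, and the 40-second lease duration is the kubelet default rather than something this log shows):

package main

import (
	"context"
	"fmt"
	"time"

	coordv1 "k8s.io/api/coordination/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

// ensureLease gets the node's Lease in kube-node-lease, creates it on
// 404, and otherwise retries on the short interval the log reports.
func ensureLease(ctx context.Context, cs kubernetes.Interface, node string) error {
	leases := cs.CoordinationV1().Leases("kube-node-lease")
	secs := int32(40) // default kubelet lease duration; assumption
	for {
		if _, err := leases.Get(ctx, node, metav1.GetOptions{}); err == nil {
			return nil // lease already exists
		} else if apierrors.IsNotFound(err) {
			lease := &coordv1.Lease{
				ObjectMeta: metav1.ObjectMeta{Name: node, Namespace: "kube-node-lease"},
				Spec: coordv1.LeaseSpec{
					HolderIdentity:       &node,
					LeaseDurationSeconds: &secs,
					RenewTime:            &metav1.MicroTime{Time: time.Now()},
				},
			}
			if _, err := leases.Create(ctx, lease, metav1.CreateOptions{}); err == nil {
				return nil
			}
		} else {
			fmt.Printf("ensure lease: %v; retrying in 200ms\n", err)
		}
		time.Sleep(200 * time.Millisecond)
	}
}

func main() {
	cfg, err := rest.InClusterConfig() // assumes running inside the cluster
	if err != nil {
		panic(err)
	}
	_ = ensureLease(context.Background(), kubernetes.NewForConfigOrDie(cfg), "crc")
}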
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.117116 4885 factory.go:55] Registering systemd factory
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.117173 4885 factory.go:221] Registration of the systemd container factory successfully
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.120120 4885 factory.go:153] Registering CRI-O factory
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.120189 4885 factory.go:221] Registration of the crio container factory successfully
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.120323 4885 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.120360 4885 factory.go:103] Registering Raw factory
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.120384 4885 manager.go:1196] Started watching for new ooms in manager
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.124171 4885 manager.go:319] Starting recovery of all containers
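With recovery started, the reconstruct.go:130 burst that follows is the volume manager rebuilding its actual state of the world from disk: the API server is still unreachable, so each mount found under the pod volume directories is added as "uncertain" until it can be verified. The on-disk layout is pods/<podUID>/volumes/<escaped plugin name, e.g. kubernetes.io~secret>/<volume name>; a minimal sketch of that walk (a simplified illustration, not the kubelet's reconstruction code):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// reconstructVolumes scans /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<name>,
// the layout the reconstruct.go lines below report. Every directory found
// becomes an "uncertain" mount until it can be verified against the API.
func reconstructVolumes(root string) error {
	pods, err := os.ReadDir(filepath.Join(root, "pods"))
	if err != nil {
		return err
	}
	for _, pod := range pods {
		volRoot := filepath.Join(root, "pods", pod.Name(), "volumes")
		plugins, err := os.ReadDir(volRoot)
		if err != nil {
			continue // pod has no volumes directory
		}
		for _, plugin := range plugins {
			names, err := os.ReadDir(filepath.Join(volRoot, plugin.Name()))
			if err != nil {
				continue
			}
			for _, name := range names {
				fmt.Printf("uncertain volume: pod=%s plugin=%s name=%s\n",
					pod.Name(), plugin.Name(), name.Name())
			}
		}
	}
	return nil
}

func main() {
	if err := reconstructVolumes("/var/lib/kubelet"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}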
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.128602 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.128839 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.128944 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129058 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129178 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129304 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129403 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129510 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129624 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129743 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129837 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.129927 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130042 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130174 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130289 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130403 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130544 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130655 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130764 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.130871 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131056 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131174 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131268 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131397 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131522 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131654 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131775 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.131905 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132044 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132196 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132320 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132434 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132551 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132661 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132766 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132874 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.132997 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133136 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133252 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133362 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133492 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133609 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133731 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133839 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.133954 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134092 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134228 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134343 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134457 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134575 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134687 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134812 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.134935 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135102 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135222 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135330 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135435 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135543 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135662 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135781 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.135900 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136012 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136192 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136305 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136413 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136530 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136646 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136771 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136889 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.136996 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137134 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137254 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137372 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137498 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137610 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137717 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137823 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.137932 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138078 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138195 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138302 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138412 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138525 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138649 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138760 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.138878 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139293 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139412 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139538 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139657 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139767 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139880 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.139997 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140145 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140268 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140367 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140462 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140565 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140673 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140782 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.140880 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141304 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141387 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141514 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141614 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141704 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141789 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141879 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.141986 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142194 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142323 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142448 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142561 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142676 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142788 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.142907 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.143071 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.143181 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.143296 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.143407 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.145564 4885 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.145713 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.146942 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147142 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147184 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147220 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147255 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147283 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147314 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147341 4885 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147368 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147395 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147428 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147538 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147568 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147597 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147628 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147660 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147694 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147723 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147753 4885 reconstruct.go:130] "Volume is marked as 
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147799 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147828 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147856 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147887 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147916 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147945 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.147977 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148007 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148069 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148098 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148125 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148153 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148179 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148207 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148233 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148258 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148284 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148311 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148338 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148364 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148389 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148418 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148452 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148480 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148510 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148536 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148562 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148591 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148618 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148646 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148675 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148701 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148726 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148750 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148776 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148806 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148831 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148857 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148886 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148914 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148940 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148965 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.148992 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149051 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149080 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149110 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149137 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149162 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149187 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149213 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149240 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149268 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149294 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149318 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149345 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149373 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149400 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149428 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149457 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149486 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149515 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149540 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149565 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149593 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149622 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149650 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149679 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149704 4885 reconstruct.go:97] "Volume reconstruction finished" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.149724 4885 reconciler.go:26] "Reconciler: start to sync state" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.160479 4885 manager.go:324] Recovery completed Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.169674 4885 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.171401 4885 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.171439 4885 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.171468 4885 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.171514 4885 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.172733 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.172825 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.176782 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.178746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.178800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.178813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.180269 4885 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.180298 4885 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.180320 4885 state_mem.go:36] "Initialized new in-memory state store" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.191143 4885 policy_none.go:49] "None policy: Start" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.191643 4885 
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.191667 4885 state_mem.go:35] "Initializing new in-memory state store"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.210012 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.245401 4885 manager.go:334] "Starting Device Plugin manager"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.245482 4885 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.245498 4885 server.go:79] "Starting device plugin registration server"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.245930 4885 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.245950 4885 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.246138 4885 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.246233 4885 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.246248 4885 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.253037 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.272313 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.272457 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.273564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.273611 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.273622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.273813 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
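"SyncLoop ADD" with source="file" means these five control-plane pods come from static manifests on disk rather than from the API server, which is why they can start while api-int.crc.testing:6443 is still refusing connections. A rough sketch of reading such manifests, assuming the conventional staticPodPath of /etc/kubernetes/manifests (the actual path comes from the kubelet config) and a gopkg.in/yaml.v3 dependency:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"

	"gopkg.in/yaml.v3"
)

// podManifest is a minimal slice of a Pod manifest: just enough to
// recover the namespace/name pairs reported by "SyncLoop ADD".
type podManifest struct {
	Metadata struct {
		Name      string `yaml:"name"`
		Namespace string `yaml:"namespace"`
	} `yaml:"metadata"`
}

func main() {
	// Conventional static pod directory; an assumption here, since the
	// real value is the kubelet's configured staticPodPath.
	dir := "/etc/kubernetes/manifests"
	entries, err := os.ReadDir(dir)
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		data, err := os.ReadFile(filepath.Join(dir, e.Name()))
		if err != nil {
			continue // skip unreadable files
		}
		var p podManifest
		if err := yaml.Unmarshal(data, &p); err != nil {
			continue // skip non-manifest files
		}
		fmt.Printf("%s/%s\n", p.Metadata.Namespace, p.Metadata.Name)
	}
}
```

The kubelet also suffixes a static pod's name with the node name, which is how a manifest named kube-apiserver shows up in this log as kube-apiserver-crc.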
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274136 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274871 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.274898 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275248 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275338 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275370 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275931 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275938 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275947 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275951 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.275957 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.276061 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.276191 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.276212 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.277757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.277779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.277786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278193 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278305 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278338 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.278983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279050 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279260 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279414 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.279454 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.280030 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.280056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.280067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.311377 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.346442 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.347668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.347705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.347746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.347781 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.348648 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351384 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351439 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351548 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
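The interval="400ms" on the lease failure above, followed by interval="800ms" later in this log, shows the node-lease controller backing off while the API server is down: each failed attempt to ensure the Lease doubles the retry wait. A self-contained sketch of that doubling pattern, with the failure message taken from the log; the exact schedule and cap are assumptions, not kubelet constants:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// ensureLease stands in for the kubelet's attempt to create or renew its
// node Lease; here it always fails, the way the log shows while the API
// server is unreachable. Hypothetical stub for illustration.
func ensureLease() error {
	return errors.New("dial tcp 38.102.83.164:6443: connect: connection refused")
}

func main() {
	// Doubling retry interval, mirroring interval="400ms" -> "800ms".
	interval := 400 * time.Millisecond
	maxInterval := 7 * time.Second // assumed cap
	for attempt := 1; attempt <= 5; attempt++ {
		if err := ensureLease(); err != nil {
			fmt.Printf("attempt %d: failed to ensure lease, will retry: %v interval=%q\n",
				attempt, err, interval.String())
			time.Sleep(interval)
			if interval *= 2; interval > maxInterval {
				interval = maxInterval
			}
			continue
		}
		return
	}
}
```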
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351674 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351797 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351866 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351923 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351960 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.351994 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.352044 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.352140 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.352181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.352215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.453923 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454108 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454143 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454236 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454343 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454252 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454249 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454322 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454551 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454617 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454799 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.454876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.549366 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.551677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.551721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.551735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.551761 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.552277 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.599466 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.609437 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.615369 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.644429 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-66f33cdbcfd917df5c1dce0d56a5dc55f100eb45147675dd6ad48a19b39e8555 WatchSource:0}: Error finding container 66f33cdbcfd917df5c1dce0d56a5dc55f100eb45147675dd6ad48a19b39e8555: Status 404 returned error can't find the container with id 66f33cdbcfd917df5c1dce0d56a5dc55f100eb45147675dd6ad48a19b39e8555
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.649930 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2b02ae77bb0a4af68c12e3021e0ae246de7c7772731dae8f441d45407799b8b6 WatchSource:0}: Error finding container 2b02ae77bb0a4af68c12e3021e0ae246de7c7772731dae8f441d45407799b8b6: Status 404 returned error can't find the container with id 2b02ae77bb0a4af68c12e3021e0ae246de7c7772731dae8f441d45407799b8b6
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.654549 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.665169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.687115 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0b059432abb5dc7f0f0d5d6ae2610d3ce239293924b6aac8b0f17ac0b0e9360d WatchSource:0}: Error finding container 0b059432abb5dc7f0f0d5d6ae2610d3ce239293924b6aac8b0f17ac0b0e9360d: Status 404 returned error can't find the container with id 0b059432abb5dc7f0f0d5d6ae2610d3ce239293924b6aac8b0f17ac0b0e9360d
Dec 05 20:05:35 crc kubenswrapper[4885]: W1205 20:05:35.693100 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-29ec91a41a8a498a185c72b47c193ad587344d4c3844a0ec5d754961b92307cd WatchSource:0}: Error finding container 29ec91a41a8a498a185c72b47c193ad587344d4c3844a0ec5d754961b92307cd: Status 404 returned error can't find the container with id 29ec91a41a8a498a185c72b47c193ad587344d4c3844a0ec5d754961b92307cd
Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.712193 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.953204 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.955405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.955454 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:35 crc kubenswrapper[4885]: I1205 20:05:35.955503 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:05:35 crc kubenswrapper[4885]: E1205 20:05:35.956137 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.108846 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.109062 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:22:25.486154862 +0000 UTC Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.109099 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 177h16m49.377058137s for next certificate rotation Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.177489 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="06da11659e81059864ccf0d3b74a3d75dde58fe19886ba86ac2bb69176a9b634" exitCode=0 Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.177592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"06da11659e81059864ccf0d3b74a3d75dde58fe19886ba86ac2bb69176a9b634"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.177766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b02ae77bb0a4af68c12e3021e0ae246de7c7772731dae8f441d45407799b8b6"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.177900 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179084 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179251 4885 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709" exitCode=0 Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"288f5712c845c597875ccf5f8536fe5946d562ceee9c02a776d88dae9ffca8f8"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.179488 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.180496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.180534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.180547 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.181810 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.181846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"66f33cdbcfd917df5c1dce0d56a5dc55f100eb45147675dd6ad48a19b39e8555"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.184163 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d" exitCode=0 Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.184244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.184289 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29ec91a41a8a498a185c72b47c193ad587344d4c3844a0ec5d754961b92307cd"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.184402 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185300 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185744 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ff814371577f4f6e2addff05065a1ec70a537aca1e1da2e5a35b1d95e1b0a9e4" exitCode=0 Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185784 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ff814371577f4f6e2addff05065a1ec70a537aca1e1da2e5a35b1d95e1b0a9e4"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0b059432abb5dc7f0f0d5d6ae2610d3ce239293924b6aac8b0f17ac0b0e9360d"} Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.185893 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.186660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.186704 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.186723 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.187070 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.187596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.187615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.187624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: W1205 20:05:36.346978 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.347092 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.514175 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Dec 05 20:05:36 crc kubenswrapper[4885]: W1205 20:05:36.651428 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.651524 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:05:36 crc kubenswrapper[4885]: W1205 20:05:36.675905 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.675991 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:05:36 crc kubenswrapper[4885]: W1205 20:05:36.697794 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.698651 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.757280 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.758910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.758959 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.758970 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:36 crc kubenswrapper[4885]: I1205 20:05:36.758993 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:05:36 crc kubenswrapper[4885]: E1205 20:05:36.759464 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.189303 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"51d67c4663bd1eb264333634d2d038485ad69fee90aaa130ce3b7b51331a4c35"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.189401 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.190179 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.190517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.190527 4885 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.191819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.191849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.191869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.191930 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.192720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.192758 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.192769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194164 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194176 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194246 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194965 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.194988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.197990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.198014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.198121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.198133 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.199533 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f73ce4a6e8fb06b0716bec50ddad8c0aa971176004b1b743a09266f688e0ea01" exitCode=0 Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.199584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f73ce4a6e8fb06b0716bec50ddad8c0aa971176004b1b743a09266f688e0ea01"} Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.199732 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.200462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.200491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:37 crc kubenswrapper[4885]: I1205 20:05:37.200502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.208974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982"} Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.209122 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.210621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.210679 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.210704 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.212656 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="0b0f237c8ed322ea4af71a89e6a1f69ad84df2c91ef413ebcde02e0601c9d005" exitCode=0 Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.212773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0b0f237c8ed322ea4af71a89e6a1f69ad84df2c91ef413ebcde02e0601c9d005"} Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.212952 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.212950 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.217367 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.217450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.217488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.217863 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.218150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.218790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.359705 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.361670 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.361743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.361763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.361802 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:05:38 crc kubenswrapper[4885]: I1205 20:05:38.717603 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.218696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4faef65793dcc78afed1b6b7677caa46ebe1578b77ec955101ecaebdb860f1d"} Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.218749 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.218776 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.218789 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:39 crc 
kubenswrapper[4885]: I1205 20:05:39.218749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7c6c46ec94cbf50d04ee3bb5f66bf8b49a0ecbb8c7d0c2d66e94f8f6408d69a"} Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.219703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"48456018ab473430deb463c3349bc700a0f2333d622c723da0751ed10f1e7fa0"} Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220125 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220367 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220392 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.220406 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.236193 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.236350 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.237458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.237496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.237512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:39 crc kubenswrapper[4885]: I1205 20:05:39.719335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.227840 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.227875 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.227958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8807af07a9cbe7820b612ead962677bb60df9262e1ec0218e5fe08a86b2d1258"} Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.227902 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.228063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8071db29348d1f568606112bf008fc47810b275448b953f473a30d8a9ca83868"} Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.229256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.229300 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.229316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.230274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.230312 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.230329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.340350 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.340549 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.342898 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.342959 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.342978 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:40 crc kubenswrapper[4885]: I1205 20:05:40.851324 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.231161 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.231255 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.232146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.232173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.232182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.233149 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.233170 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.233179 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.718100 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.718192 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:05:41 crc kubenswrapper[4885]: I1205 20:05:41.732275 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:42 crc kubenswrapper[4885]: I1205 20:05:42.233533 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:42 crc kubenswrapper[4885]: I1205 20:05:42.234593 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:42 crc kubenswrapper[4885]: I1205 20:05:42.234659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:42 crc kubenswrapper[4885]: I1205 20:05:42.234691 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.157524 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.157800 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.159545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.159585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.159595 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.507838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.508138 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.509587 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.509662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:44 crc kubenswrapper[4885]: I1205 20:05:44.509692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:45 crc kubenswrapper[4885]: E1205 
Dec 05 20:05:45 crc kubenswrapper[4885]: E1205 20:05:45.253218 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.326060 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.326271 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.327524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.327563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.327575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:46 crc kubenswrapper[4885]: I1205 20:05:46.332194 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.107934 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.245698 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.246577 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.246615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.246628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.250201 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.701364 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.701444 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.705679 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 20:05:47 crc kubenswrapper[4885]: I1205 20:05:47.705744 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.251962 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.253406 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.253475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.253500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.596328 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.596773 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.598751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.598848 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.598874 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:48 crc kubenswrapper[4885]: I1205 20:05:48.638528 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 05 20:05:49 crc kubenswrapper[4885]: I1205 20:05:49.176082 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 05 20:05:49 crc kubenswrapper[4885]: I1205 20:05:49.255233 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:49 crc kubenswrapper[4885]: I1205 20:05:49.256577 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:49 crc kubenswrapper[4885]: I1205 20:05:49.256623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:49 crc kubenswrapper[4885]: I1205 20:05:49.256636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:50 crc kubenswrapper[4885]: I1205 20:05:50.257969 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 20:05:50 crc kubenswrapper[4885]: I1205 20:05:50.259376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:50 crc kubenswrapper[4885]: I1205 20:05:50.259437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.718781 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.718918 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.740938 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.741246 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.742857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.742915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.742925 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:51 crc kubenswrapper[4885]: I1205 20:05:51.748900 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.264098 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.264201 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.265520 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.265589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.265613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:52 crc kubenswrapper[4885]: E1205 20:05:52.692699 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.694556 4885 trace.go:236] Trace[1880018207]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:05:38.809) (total time: 13884ms): Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[1880018207]: ---"Objects listed" error: 13884ms (20:05:52.694) Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[1880018207]: [13.884662171s] [13.884662171s] END Dec 05 20:05:52 crc 
kubenswrapper[4885]: I1205 20:05:52.694603 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:05:52 crc kubenswrapper[4885]: E1205 20:05:52.696800 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.697699 4885 trace.go:236] Trace[409682982]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:05:38.836) (total time: 13860ms): Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[409682982]: ---"Objects listed" error: 13860ms (20:05:52.697) Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[409682982]: [13.860742281s] [13.860742281s] END Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.697734 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.698197 4885 trace.go:236] Trace[1595040675]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:05:38.691) (total time: 14006ms): Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[1595040675]: ---"Objects listed" error: 14006ms (20:05:52.698) Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[1595040675]: [14.006729828s] [14.006729828s] END Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.698236 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.698254 4885 trace.go:236] Trace[941826119]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:05:39.545) (total time: 13152ms): Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[941826119]: ---"Objects listed" error: 13152ms (20:05:52.698) Dec 05 20:05:52 crc kubenswrapper[4885]: Trace[941826119]: [13.152839657s] [13.152839657s] END Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.698279 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:05:52 crc kubenswrapper[4885]: I1205 20:05:52.700690 4885 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.107822 4885 apiserver.go:52] "Watching apiserver" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.111278 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.111619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.112207 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.112313 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.112401 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.112538 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.112576 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.112918 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.112980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.113181 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.113244 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.115617 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.115778 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.116004 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.116210 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.116210 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.116257 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.116425 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.117142 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.117225 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.138855 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.151788 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.161780 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.170761 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.192373 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.201563 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60594->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.201614 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60594->192.168.126.11:17697: read: connection reset by peer" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.201601 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60592->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.201794 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60592->192.168.126.11:17697: read: connection reset by peer" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.201944 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.202006 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.210936 4885 desired_state_of_world_populator.go:154] "Finished populating initial desired state of 
world" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.220842 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.242366 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.261253 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.268619 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.270607 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982" exitCode=255 Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.270643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982"} Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.281571 4885 scope.go:117] "RemoveContainer" containerID="e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.282057 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.290133 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303295 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303336 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.303512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303536 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303582 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303603 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303628 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303653 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303682 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303737 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303789 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303812 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303890 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303942 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303987 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304113 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304298 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304331 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304382 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304425 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304440 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304455 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304485 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304531 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304606 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.304648 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304682 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304711 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304727 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304757 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304774 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.304805 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304820 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304913 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304927 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304941 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304958 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.304973 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305139 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305169 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305185 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305232 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305248 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305264 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305281 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305296 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305311 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305326 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305341 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305358 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305374 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305427 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305483 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305553 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305568 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305582 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305598 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305641 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305720 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305755 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305772 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305788 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305848 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305863 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305878 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.305912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305929 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305962 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305978 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306009 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306039 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306057 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 
20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306144 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308275 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.308335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308462 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308488 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308607 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:05:53 crc 
kubenswrapper[4885]: I1205 20:05:53.308634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308702 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309255 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309287 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309383 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309433 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309476 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309540 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309619 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309693 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309757 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309849 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309923 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310057 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310082 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310107 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310132 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310162 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310192 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310225 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310409 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310454 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310516 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310676 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310702 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310733 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310761 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310791 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326839 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.338650 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303642 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.339251 4885 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.339230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.303920 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304345 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.304471 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305243 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305507 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305640 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305765 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.305894 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.306941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307563 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307700 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307758 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307921 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.307963 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308184 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308208 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308474 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308719 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.308901 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309032 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309354 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309598 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309626 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309963 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.309974 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310704 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310786 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.310920 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.311120 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.311328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.312709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.314503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.314621 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.314634 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.314736 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315029 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315277 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315578 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315768 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315919 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.316055 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.315747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.316611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.316676 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.316961 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.318854 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.319255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.319766 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.319936 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.319952 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.319833 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.320814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.321205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.321247 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.321414 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.322081 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.322459 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323496 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323479 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323608 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324447 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324565 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324591 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.323856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.324457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326761 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.326973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327138 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327731 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327754 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.327991 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328139 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328361 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328488 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328533 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328385 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328640 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.328894 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329003 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.340106 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329278 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329326 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329436 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329446 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329585 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329911 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329930 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.329863 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.330052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.330173 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.330251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.330713 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.330716 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331099 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331359 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.331999 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.332006 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.332218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.332284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.332530 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.332787 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.333076 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.333161 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:05:53.833141737 +0000 UTC m=+19.129957398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.335054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.335755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.340324 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.335834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.336220 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.336427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.336635 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
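The E1205 nestedpendingoperations.go:348 entry above is the volume manager backing off: the kubevirt.io.hostpath-provisioner CSI driver has not yet re-registered with kubelet over the plugin-registration socket after the restart, so Unmounter.TearDownAt cannot obtain a CSI client, and the operation is re-queued after the logged 500ms delay (the delay grows on repeated failures). The node-level view of which drivers are registered lives in the CSINode object. A hedged sketch for inspecting it from outside the node; client-go is assumed, and the node name "crc" comes from the log prefix:

    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // CSINode is cluster-scoped and named after the node.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            // kubevirt.io.hostpath-provisioner should appear here once its
            // node plugin pod is running again and has re-registered.
            fmt.Println("registered:", d.Name)
        }
    }

Until the driver re-registers, the retry loop below is expected behavior rather than a terminal failure.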
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.338192 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.340388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.340676 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.340726 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:53.840706986 +0000 UTC m=+19.137522647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.340756 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:53.840744668 +0000 UTC m=+19.137560449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.340942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.341358 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.341365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.341730 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.341795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.341882 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.342269 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.342892 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.342946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.343265 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.343367 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.343818 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.343975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.344112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.355005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.344185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.354420 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.348097 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355133 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.353014 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355198 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.350683 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.355276 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355305 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.355308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355332 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.354953 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355327 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:53.855278746 +0000 UTC m=+19.152094597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355349 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.355003 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.355465 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:53.855448261 +0000 UTC m=+19.152263922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.355867 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.356294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.356610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.356763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.356818 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.358007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.361682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362279 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362409 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362630 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362729 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362771 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.362816 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.363745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.363821 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.364163 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.364288 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.364521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.365459 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.370364 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.370774 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.371230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.376203 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.378187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.379547 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.380989 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.386988 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.390155 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.397949 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412079 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412225 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412237 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412248 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412257 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412266 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412276 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412284 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412301 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412311 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412319 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412328 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412337 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412346 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412355 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412363 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412381 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412390 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412398 4885 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412406 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412414 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412423 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412431 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412439 4885 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412448 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412456 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412464 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412471 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412480 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412488 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412496 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412505 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412513 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412522 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412531 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412540 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412547 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412555 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412563 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412570 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412579 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412587 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412595 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412604 4885 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412612 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412620 4885 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412630 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412639 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412647 4885 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412656 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412664 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412672 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412679 4885 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412686 4885 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412694 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412702 4885 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412711 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412720 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412730 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412739 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412749 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412756 4885 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412765 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412774 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412782 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412790 4885 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412798 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412806 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412813 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412822 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412830 4885 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412838 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412846 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412854 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 
20:05:53.412862 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412870 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412878 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412885 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412893 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412901 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412909 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412917 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412925 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412933 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412949 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412958 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412966 4885 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412974 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412981 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412989 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.412997 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413005 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413012 4885 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413034 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413042 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413050 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413057 4885 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413066 4885 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413075 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413084 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413092 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413100 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413107 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413117 4885 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413125 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413133 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413141 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413150 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413157 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413170 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413178 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413185 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413193 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413201 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413209 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413217 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413224 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413231 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413239 4885 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413249 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413257 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413266 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413274 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413281 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413291 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413299 4885 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413308 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413315 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413322 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413338 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413345 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413353 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413360 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413368 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413375 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413383 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413390 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413398 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413405 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413413 4885 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413422 4885 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413429 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413436 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413443 4885 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413452 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413459 4885 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413467 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413475 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413482 4885 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413490 4885 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413498 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413506 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413513 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413521 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413528 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413536 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413544 4885 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413552 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413559 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413568 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413575 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413583 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413592 4885 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413601 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413609 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413618 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413627 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413636 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413646 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413654 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413662 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413670 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413681 4885 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413689 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413697 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413706 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413713 4885 reconciler_common.go:293] "Volume detached for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413721 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413729 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413737 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413746 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413753 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413761 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413769 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413776 4885 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413785 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413793 4885 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413800 4885 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413808 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413816 4885 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413823 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413832 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413925 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.425217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.433760 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.438892 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413925 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.413959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.425217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.433760 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.438892 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 20:05:53 crc kubenswrapper[4885]: W1205 20:05:53.453638 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6dfcdcb3a827ac21cb936eaf06bfa8a490f8a2350673a77bba366cee71a4103d WatchSource:0}: Error finding container 6dfcdcb3a827ac21cb936eaf06bfa8a490f8a2350673a77bba366cee71a4103d: Status 404 returned error can't find the container with id 6dfcdcb3a827ac21cb936eaf06bfa8a490f8a2350673a77bba366cee71a4103d
Dec 05 20:05:53 crc kubenswrapper[4885]: W1205 20:05:53.453889 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a2d242e8626fcd84b953d6950ab3c9e923e9a65b690c8505514de8acd4b58604 WatchSource:0}: Error finding container a2d242e8626fcd84b953d6950ab3c9e923e9a65b690c8505514de8acd4b58604: Status 404 returned error can't find the container with id a2d242e8626fcd84b953d6950ab3c9e923e9a65b690c8505514de8acd4b58604
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.918826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.918928 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.918961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.918984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:05:53 crc kubenswrapper[4885]: I1205 20:05:53.919013 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
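The reconciler_common.go:159/:218 entries above are the volume manager's reconciliation loop diffing its desired state of the world (volumes required by pods assigned to the node) against the actual state (what is still attached or mounted from before the restart): it starts UnmountVolume for the leftover image-registry PVC and MountVolume for the pods being restored. A schematic sketch of that diff follows; the types and set representation are illustrative, not kubelet's actual internals.

// A schematic sketch of the desired-state/actual-state diff behind the
// "UnmountVolume started" / "MountVolume started" pairs above.
package main

import "fmt"

type volumeID string

func reconcile(desired, actual map[volumeID]bool) {
	// Mounted but no longer desired -> unmount (e.g. the CSI PVC of the
	// terminated registry pod above).
	for v := range actual {
		if !desired[v] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
		}
	}
	// Desired but not yet mounted -> mount (e.g. the projected service
	// account tokens for the network-check pods).
	for v := range desired {
		if !actual[v] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
		}
	}
}

func main() {
	desired := map[volumeID]bool{"kube-api-access-s2dwl": true, "nginx-conf": true}
	actual := map[volumeID]bool{"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8": true}
	reconcile(desired, actual)
}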
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919089 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:05:54.919063907 +0000 UTC m=+20.215879568 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919196 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919244 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919288 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:54.919260703 +0000 UTC m=+20.216076444 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919296 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919307 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919311 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919332 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919336 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
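The TearDownAt failure above means the unmount ran before the kubevirt.io.hostpath-provisioner CSI plugin had re-registered with the restarted kubelet, so the driver name was missing from the registered-driver list; the "not registered" configmap/secret errors are most likely the same restart-ordering effect in the kubelet's object caches, and both clear once registration catches up, which is why retries are simply scheduled rather than the pods being failed. A node's currently registered CSI drivers are published in its storage.k8s.io/v1 CSINode object; a minimal client-go sketch to inspect them (the kubeconfig path is hypothetical, error handling trimmed):

// List the CSI drivers registered on node "crc" via its CSINode object.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// CSINode mirrors the kubelet's registered CSI plugins for the node.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered CSI driver:", d.Name)
	}
}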
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919368 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:54.919360107 +0000 UTC m=+20.216175768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919251 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919437 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:54.919410668 +0000 UTC m=+20.216226379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
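Each failed mount or unmount above is parked by nestedpendingoperations.go:348 with a "No retries permitted until ..." deadline, starting at the logged durationBeforeRetry of 1s and backing off exponentially if the same volume operation keeps failing. The exact base and cap are kubelet internals; the sketch below only illustrates the doubling shape of those deadlines, with an illustrative ceiling.

// Illustrative doubling backoff matching the "durationBeforeRetry 1s" entries.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := time.Second             // first retry delay, as logged above
	const maxDelay = 2 * time.Minute // illustrative cap, not a kubelet constant
	deadline := time.Date(2025, 12, 5, 20, 5, 53, 919000000, time.UTC)
	for attempt := 1; attempt <= 5; attempt++ {
		deadline = deadline.Add(delay)
		fmt.Printf("attempt %d: no retries permitted until %s (delay %s)\n",
			attempt, deadline.Format(time.RFC3339), delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}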
Dec 05 20:05:53 crc kubenswrapper[4885]: E1205 20:05:53.919503 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:54.919485921 +0000 UTC m=+20.216301632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.289005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9645684d59d3b986b9787ee29799f4b51943aed705890dd4bf96d149aa5d6172"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.290599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.290633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.290645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a2d242e8626fcd84b953d6950ab3c9e923e9a65b690c8505514de8acd4b58604"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.292130 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.292162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6dfcdcb3a827ac21cb936eaf06bfa8a490f8a2350673a77bba366cee71a4103d"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.294258 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.295454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6"}
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.295943 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.320497 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.336919 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.352190 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.369261 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.385201 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.401962 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.422309 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.437141 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.453181 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.467070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.479196 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.495714 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.542209 4885 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.586623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.926678 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.926825 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.926922 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:05:56.926880541 +0000 UTC m=+22.223696202 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.926957 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927000 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.927007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.927057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:54 crc kubenswrapper[4885]: I1205 20:05:54.927082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927034 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927196 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:56.92716927 +0000 UTC m=+22.223985101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927235 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927282 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:56.927274713 +0000 UTC m=+22.224090374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927318 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927345 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:56.927337205 +0000 UTC m=+22.224153086 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927413 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927425 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927437 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:54 crc kubenswrapper[4885]: E1205 20:05:54.927465 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:05:56.927456379 +0000 UTC m=+22.224272040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.172370 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.172480 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.172493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.172592 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.172745 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.172864 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.177957 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.179110 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.181392 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.182836 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.183788 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.184655 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.186676 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.188057 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.189732 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.189766 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.191056 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.192282 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.193871 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.195421 4885 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.197325 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.198131 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.198860 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.199906 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.200465 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.201722 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.202450 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.203037 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.203647 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.205178 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.205189 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.205966 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.206843 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.207467 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.208632 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.209177 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.209826 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.210775 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.211342 4885 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.211450 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.213628 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.214465 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.215054 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.217196 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.217841 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.218385 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.219471 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.220603 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.221165 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.221390 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.221889 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.223170 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.224232 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.224747 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.225752 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.226331 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.227994 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 20:05:55 crc 
kubenswrapper[4885]: I1205 20:05:55.229377 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.231423 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.232345 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.232946 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.233639 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.234138 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.236860 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.257967 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.275870 4885 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.312726 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.897478 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.900119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.900154 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.900167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.900243 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.910152 4885 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.910635 4885 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.912172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.912225 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.912235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.912257 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.912272 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:55Z","lastTransitionTime":"2025-12-05T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.934400 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154a
fa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.940126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.940173 4885 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.940183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.940199 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.940212 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:55Z","lastTransitionTime":"2025-12-05T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.955559 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.960467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.960518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.960534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.960558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.960576 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:55Z","lastTransitionTime":"2025-12-05T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:55 crc kubenswrapper[4885]: E1205 20:05:55.976668 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.982681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.982753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
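The repeated webhook failure above is an ordinary certificate validity-window check: the network-node-identity webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05, so every TLS handshake to https://127.0.0.1:9743 is rejected before the node PATCH is even attempted. A minimal Go sketch of the same NotBefore/NotAfter comparison (the certificate path below is a placeholder for illustration, not the path network-node-identity actually uses):

package main

import (
    "crypto/x509"
    "encoding/pem"
    "fmt"
    "os"
    "time"
)

func main() {
    // Hypothetical path; substitute wherever the webhook's serving cert lives.
    pemBytes, err := os.ReadFile("/tmp/webhook-serving-cert.pem")
    if err != nil {
        panic(err)
    }
    block, _ := pem.Decode(pemBytes)
    if block == nil {
        panic("no PEM block found")
    }
    cert, err := x509.ParseCertificate(block.Bytes)
    if err != nil {
        panic(err)
    }
    now := time.Now().UTC()
    // The comparison behind "certificate has expired or is not yet valid".
    switch {
    case now.After(cert.NotAfter):
        fmt.Printf("expired: current time %s is after %s\n",
            now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
    case now.Before(cert.NotBefore):
        fmt.Printf("not yet valid: current time %s is before %s\n",
            now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
    default:
        fmt.Println("certificate is within its validity window")
    }
}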
event="NodeHasNoDiskPressure" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.982767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.982788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:55 crc kubenswrapper[4885]: I1205 20:05:55.982801 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:55Z","lastTransitionTime":"2025-12-05T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.019169 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.023810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.023850 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
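Each "Error updating node status, will retry" record above is one attempt in the kubelet's bounded status-update loop: the PATCH is retried a fixed number of times per sync (the kubelet constant nodeStatusUpdateRetry, 5 in recent releases), after which it gives up with the "update node status exceeds retry count" error seen below and waits for the next sync period. A rough stand-alone sketch of that pattern, with a hypothetical patchNodeStatus stub standing in for the real PATCH call:

package main

import (
    "errors"
    "fmt"
)

// Mirrors the kubelet's per-sync retry budget (nodeStatusUpdateRetry).
const nodeStatusUpdateRetry = 5

// Stand-in for the PATCH that the admission webhook rejects in this log.
func patchNodeStatus() error {
    return errors.New("tls: failed to verify certificate: x509: certificate has expired or is not yet valid")
}

func updateNodeStatus() error {
    for i := 0; i < nodeStatusUpdateRetry; i++ {
        if err := patchNodeStatus(); err != nil {
            fmt.Printf("Error updating node status, will retry: %v\n", err)
            continue
        }
        return nil // one success ends the loop
    }
    return errors.New("update node status exceeds retry count")
}

func main() {
    if err := updateNodeStatus(); err != nil {
        fmt.Println("Unable to update node status:", err)
    }
}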
event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.023862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.023880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.023895 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.041977 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.042138 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.043962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.043996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.044007 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.044045 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.044062 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.147003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.147081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.147092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.147115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.147127 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.249909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.249952 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.249961 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.249978 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.249991 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.352751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.353291 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.353399 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.353492 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.353551 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.456984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.457058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.457067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.457086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.457098 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.560256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.560316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.560329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.560349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.560366 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.663219 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.663292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.663302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.663318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.663329 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.765836 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.765909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.765922 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.765943 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.765959 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.868538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.868587 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.868599 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.868613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.868624 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.945467 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.945573 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.945601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.945621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.945652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945714 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:06:00.945679758 +0000 UTC m=+26.242495419 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945818 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945864 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945866 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945888 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945990 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945990 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.946057 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.946074 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.945900 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:00.945877764 +0000 UTC m=+26.242693425 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.946235 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:00.946153703 +0000 UTC m=+26.242969364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.946274 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:00.946262936 +0000 UTC m=+26.243078597 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:56 crc kubenswrapper[4885]: E1205 20:05:56.946296 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:00.946288427 +0000 UTC m=+26.243104088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.971337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.971396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.971408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.971429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:56 crc kubenswrapper[4885]: I1205 20:05:56.971443 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:56Z","lastTransitionTime":"2025-12-05T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.074563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.074597 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.074607 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.074622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.074632 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.172789 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.172848 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.172789 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:57 crc kubenswrapper[4885]: E1205 20:05:57.172929 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:05:57 crc kubenswrapper[4885]: E1205 20:05:57.173032 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:05:57 crc kubenswrapper[4885]: E1205 20:05:57.173112 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.177349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.177378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.177389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.177405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.177415 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.279984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.280058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.280072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.280090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.280104 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.318251 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.329231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-msl9r"] Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.329577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.334421 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.334691 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.335154 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.348499 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.348710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3c5536c-62ec-4ca4-938c-1e0322c676b4-hosts-file\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.348861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwlb\" (UniqueName: \"kubernetes.io/projected/a3c5536c-62ec-4ca4-938c-1e0322c676b4-kube-api-access-qbwlb\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " 
pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.373034 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.382717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.382760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.382770 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.382787 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.382800 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.394519 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.408149 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.426091 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.440204 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.450319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3c5536c-62ec-4ca4-938c-1e0322c676b4-hosts-file\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.450382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwlb\" (UniqueName: \"kubernetes.io/projected/a3c5536c-62ec-4ca4-938c-1e0322c676b4-kube-api-access-qbwlb\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.450562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3c5536c-62ec-4ca4-938c-1e0322c676b4-hosts-file\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.454747 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.466479 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.476067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwlb\" (UniqueName: \"kubernetes.io/projected/a3c5536c-62ec-4ca4-938c-1e0322c676b4-kube-api-access-qbwlb\") pod \"node-resolver-msl9r\" (UID: \"a3c5536c-62ec-4ca4-938c-1e0322c676b4\") " pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.485402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.485447 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.485460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.485478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: 
I1205 20:05:57.485490 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.488902 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb29
6d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.514514 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.541913 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.570535 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.588550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.588596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.588607 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.588624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.588636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.594699 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z"
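Every status-patch failure above has the same root cause: the kubelet's PATCH is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate whose validity window ended 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-05T20:05:57Z. A minimal sketch of the validity-window check that the TLS handshake applies, in stdlib Go (the PEM path is hypothetical; this is not the kubelet's or the webhook's actual code):

// certcheck.go - sketch: report whether a PEM certificate is inside its validity window.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("/tmp/webhook-cert.pem") // hypothetical path
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now()
	fmt.Printf("NotBefore: %s\nNotAfter:  %s\n", cert.NotBefore, cert.NotAfter)
	// Same window test the handshake performs: NotBefore <= now <= NotAfter.
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Printf("certificate has expired or is not yet valid: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
}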
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.627972 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.642117 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-msl9r" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.690979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.691033 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.691045 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.691060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.691069 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.742820 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5m8lc"] Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.743265 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zmtwj"] Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.743455 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.743516 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.745232 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-c5qh5"] Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.746182 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.747166 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wx7m6"] Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.748181 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.752836 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.753672 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.753808 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.753852 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.754056 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: W1205 20:05:57.754304 4885 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 05 20:05:57 crc kubenswrapper[4885]: E1205 20:05:57.754348 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.754420 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.754480 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.754684 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755009 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755250 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755447 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755566 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755709 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-cni-binary-copy\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-etc-kubernetes\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755798 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-netns\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-rootfs\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-proxy-tls\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755888 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-socket-dir-parent\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755904 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-k8s-cni-cncf-io\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.755960 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-hostroot\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-conf-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-multus-certs\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756114 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756166 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756250 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd7qn\" (UniqueName: \"kubernetes.io/projected/c6c25e90-efcc-490c-afef-970c3a62c809-kube-api-access-qd7qn\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756315 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-mcd-auth-proxy-config\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756329 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756339 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-bin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-multus-daemon-config\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756388 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-cnibin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-os-release\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 
20:05:57.756448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-kubelet\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-multus\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756526 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756674 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkflz\" (UniqueName: \"kubernetes.io/projected/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-kube-api-access-tkflz\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.756829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-system-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.788825 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.798570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.798608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.798617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.798631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.798640 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.809580 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.829925 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.851689 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-os-release\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-system-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858193 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-os-release\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-cni-binary-copy\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-netns\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858282 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858331 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-system-cni-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-cnibin\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-proxy-tls\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858387 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-k8s-cni-cncf-io\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858413 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd7qn\" (UniqueName: \"kubernetes.io/projected/c6c25e90-efcc-490c-afef-970c3a62c809-kube-api-access-qd7qn\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-mcd-auth-proxy-config\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-bin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858467 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-multus-daemon-config\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858484 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858506 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858524 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-cnibin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858540 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-kubelet\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.858576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.859446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkflz\" (UniqueName: \"kubernetes.io/projected/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-kube-api-access-tkflz\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.859488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-multus\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.859516 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.859583 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-cnibin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.859629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-kubelet\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860000 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-multus\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860044 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-var-lib-cni-bin\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860286 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-os-release\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-system-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860657 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860744 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-k8s-cni-cncf-io\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860768 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-multus-daemon-config\") pod \"multus-zmtwj\" (UID: 
\"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860779 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-netns\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860820 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.860895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-etc-kubernetes\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-etc-kubernetes\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6c25e90-efcc-490c-afef-970c3a62c809-cni-binary-copy\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-cni-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 
20:05:57.861402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861444 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-rootfs\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861565 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-socket-dir-parent\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861592 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-hostroot\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-conf-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861631 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-multus-certs\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861688 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-mcd-auth-proxy-config\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 
20:05:57.861708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861663 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-rootfs\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dcsp\" (UniqueName: \"kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861754 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-conf-dir\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861789 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75bfx\" (UniqueName: \"kubernetes.io/projected/b15623dc-71c3-4ee6-9078-3980cada3660-kube-api-access-75bfx\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-host-run-multus-certs\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861911 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-multus-socket-dir-parent\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.861965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c25e90-efcc-490c-afef-970c3a62c809-hostroot\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.863203 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-proxy-tls\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.875999 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.879961 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd7qn\" (UniqueName: \"kubernetes.io/projected/c6c25e90-efcc-490c-afef-970c3a62c809-kube-api-access-qd7qn\") pod \"multus-zmtwj\" (UID: \"c6c25e90-efcc-490c-afef-970c3a62c809\") " pod="openshift-multus/multus-zmtwj" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.880887 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkflz\" (UniqueName: \"kubernetes.io/projected/21ee2046-c3c1-4501-abe5-0ac10ddfeaf1-kube-api-access-tkflz\") pod \"machine-config-daemon-5m8lc\" (UID: \"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\") " pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.893953 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.900546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.900607 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.900618 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.900639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.900655 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:57Z","lastTransitionTime":"2025-12-05T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.909347 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.926262 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.937803 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.947060 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.959214 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962692 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962769 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962810 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962880 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962898 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962963 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.962982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dcsp\" (UniqueName: \"kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75bfx\" (UniqueName: \"kubernetes.io/projected/b15623dc-71c3-4ee6-9078-3980cada3660-kube-api-access-75bfx\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963039 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963077 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963092 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-os-release\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963152 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963165 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-system-cni-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-cnibin\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-cnibin\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963575 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963620 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963922 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.963982 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964073 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964209 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-os-release\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964276 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b15623dc-71c3-4ee6-9078-3980cada3660-system-cni-dir\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn\") pod \"ovnkube-node-wx7m6\" (UID: 
\"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964922 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.964928 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b15623dc-71c3-4ee6-9078-3980cada3660-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.967180 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.971094 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.980592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75bfx\" (UniqueName: \"kubernetes.io/projected/b15623dc-71c3-4ee6-9078-3980cada3660-kube-api-access-75bfx\") pod \"multus-additional-cni-plugins-c5qh5\" (UID: \"b15623dc-71c3-4ee6-9078-3980cada3660\") " pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:57 crc kubenswrapper[4885]: I1205 20:05:57.984730 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.002635 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.003779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.003819 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.003829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.003845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.003858 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.028121 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.058143 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: 
I1205 20:05:58.065120 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.074459 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zmtwj" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.086637 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.094487 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.108695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.108733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.108743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.108759 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.108771 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.113795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: 
I1205 20:05:58.137762 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.159520 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.176581 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.188136 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.211301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.211333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.211341 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.211355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.211365 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.314510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.314949 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.314963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.314982 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.314999 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.323970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.324074 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.324092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"1eab9f94677949372ef466942b01c91700f71c8b7ddb90a8cd3c326781702be2"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.325555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerStarted","Data":"96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.325625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerStarted","Data":"3c845ac04bea16f3b69e764e06c3ae2978048a2fc567a43ea1cf7b3e1bbf6539"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.327395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerStarted","Data":"245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.327428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerStarted","Data":"096f042cc563dbbd21dd05792fadb3bd9a9dcefb746f7f26456f194a99ff034a"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.329928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msl9r" event={"ID":"a3c5536c-62ec-4ca4-938c-1e0322c676b4","Type":"ContainerStarted","Data":"d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.329993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msl9r" event={"ID":"a3c5536c-62ec-4ca4-938c-1e0322c676b4","Type":"ContainerStarted","Data":"4ca36e1b60578662a7cdd851119607264cc89714514c0e70998ed0d6c0be0075"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.338268 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.351623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.369790 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.389107 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.410775 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.417797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.417864 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.417875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.417891 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.417904 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.432645 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.448209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.463603 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.481595 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting
\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.496057 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.506939 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.520798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.520851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.520866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.520886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.520898 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.521262 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.532611 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.544466 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.559245 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f
6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.573679 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 
20:05:58.587335 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.599645 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.614103 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.623954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.623991 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.623999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.624014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.624040 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.626855 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.637416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.648695 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.658608 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.679652 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.721777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730030 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730367 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730522 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.730611 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.732788 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.740162 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.753157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.774198 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.786775 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.799599 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.803800 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.814487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dcsp\" (UniqueName: \"kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp\") pod \"ovnkube-node-wx7m6\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.815305 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.830068 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.832998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.833117 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.833157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.833186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.833205 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.843030 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.857235 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.871637 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.887064 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.897653 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.909077 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.922730 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.935614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.935658 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.935668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.935688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.935700 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:58Z","lastTransitionTime":"2025-12-05T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.950363 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.990390 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:58 crc kubenswrapper[4885]: I1205 20:05:58.996650 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:05:59 crc kubenswrapper[4885]: W1205 20:05:59.014313 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ae690a_3705_45ae_8816_da5f33d2105e.slice/crio-f0a478b9735a3f724cc2dc5edceee3817447922bda2d93fc34a194602825bfee WatchSource:0}: Error finding container f0a478b9735a3f724cc2dc5edceee3817447922bda2d93fc34a194602825bfee: Status 404 returned error can't find the container with id f0a478b9735a3f724cc2dc5edceee3817447922bda2d93fc34a194602825bfee Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.027945 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.039112 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.039374 4885 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.039486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.039586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.039674 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.070870 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.110048 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.141946 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.141987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.141996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.142008 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.142035 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.153655 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.172489 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.172511 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:05:59 crc kubenswrapper[4885]: E1205 20:05:59.172711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.172515 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:05:59 crc kubenswrapper[4885]: E1205 20:05:59.172876 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:05:59 crc kubenswrapper[4885]: E1205 20:05:59.173070 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.194552 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.228105 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.244682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.244717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.244726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.244739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.244750 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.269970 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.313146 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.337100 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5" exitCode=0 Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.337233 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.340286 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186" exitCode=0 Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.340328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.340360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"f0a478b9735a3f724cc2dc5edceee3817447922bda2d93fc34a194602825bfee"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.351205 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.351253 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.351267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.351285 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.351300 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.357163 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d
17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: E1205 20:05:59.370004 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.409465 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.449893 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.453952 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.453984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.453995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.454008 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.454016 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.488870 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.528747 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.556762 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.556796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.556806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.556822 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.556833 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.571735 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.615520 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.649913 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.659655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.659697 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.659710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.659726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.659738 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.688805 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.743763 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.762699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.762746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.762757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.762777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.762789 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.781447 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.813454 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.853533 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.865615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.865662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.865674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.865693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.865706 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.890766 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.969059 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.969126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.969136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.969157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:05:59 crc kubenswrapper[4885]: I1205 20:05:59.969170 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:05:59Z","lastTransitionTime":"2025-12-05T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.000801 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-grvrw"] Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.001260 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.003783 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.004051 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.005451 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.005500 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.022177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.049478 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.072323 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.072355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.072364 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.072381 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.072391 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.087057 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f80533-e916-4ba6-9c5d-86f0dbd6f521-host\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.087180 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgr9\" (UniqueName: \"kubernetes.io/projected/38f80533-e916-4ba6-9c5d-86f0dbd6f521-kube-api-access-4zgr9\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.087206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38f80533-e916-4ba6-9c5d-86f0dbd6f521-serviceca\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.087594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.129512 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.175813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.175861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.175871 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.175887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.175897 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.183434 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.187751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f80533-e916-4ba6-9c5d-86f0dbd6f521-host\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.187853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgr9\" (UniqueName: \"kubernetes.io/projected/38f80533-e916-4ba6-9c5d-86f0dbd6f521-kube-api-access-4zgr9\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.187885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f80533-e916-4ba6-9c5d-86f0dbd6f521-host\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.187893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38f80533-e916-4ba6-9c5d-86f0dbd6f521-serviceca\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.189597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38f80533-e916-4ba6-9c5d-86f0dbd6f521-serviceca\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.215617 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.245415 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgr9\" (UniqueName: \"kubernetes.io/projected/38f80533-e916-4ba6-9c5d-86f0dbd6f521-kube-api-access-4zgr9\") pod \"node-ca-grvrw\" (UID: \"38f80533-e916-4ba6-9c5d-86f0dbd6f521\") " pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.272503 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f48
6eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.278333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.278389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.278404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.278427 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.278440 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.311177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.313294 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grvrw" Dec 05 20:06:00 crc kubenswrapper[4885]: W1205 20:06:00.326711 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f80533_e916_4ba6_9c5d_86f0dbd6f521.slice/crio-a157a2355a320fa0b4a9899506bd3815209c31b4f5a882c8bd7443c659f4b5b5 WatchSource:0}: Error finding container a157a2355a320fa0b4a9899506bd3815209c31b4f5a882c8bd7443c659f4b5b5: Status 404 returned error can't find the container with id a157a2355a320fa0b4a9899506bd3815209c31b4f5a882c8bd7443c659f4b5b5 Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.350061 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.365371 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3" exitCode=0 Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.365566 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.384367 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.384657 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.384734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.384813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.384886 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386324 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386376 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386431 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.386449 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.388697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grvrw" event={"ID":"38f80533-e916-4ba6-9c5d-86f0dbd6f521","Type":"ContainerStarted","Data":"a157a2355a320fa0b4a9899506bd3815209c31b4f5a882c8bd7443c659f4b5b5"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.394633 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.430046 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.473358 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.489789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.489829 4885 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.489840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.489861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.489874 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.514581 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.550040 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":
\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.593655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.594065 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.594078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.594096 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.594109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.595236 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.629671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.668445 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.697118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.697161 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.697173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.697192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.697204 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.711489 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.753744 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.799975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.800012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.800043 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.800059 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.800069 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.802566 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e
2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.830085 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.871185 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.902011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.902090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.902101 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.902124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.902138 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:00Z","lastTransitionTime":"2025-12-05T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.910476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.955755 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rel
ease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.986809 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.996138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.996227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.996249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996314 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:06:08.996282522 +0000 UTC m=+34.293098193 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996371 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996389 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996399 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996400 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996438 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:08.996424486 +0000 UTC m=+34.293240147 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996497 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:08.996474888 +0000 UTC m=+34.293290619 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996498 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996562 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996580 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.996407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996628 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:08.996614822 +0000 UTC m=+34.293430563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:00 crc kubenswrapper[4885]: I1205 20:06:00.996671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996748 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:00 crc kubenswrapper[4885]: E1205 20:06:00.996784 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:08.996776078 +0000 UTC m=+34.293591739 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.004453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.004488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.004501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.004517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.004529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.034925 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.069515 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.107296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.107356 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.107389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.107419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.107435 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.113363 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-
dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.172291 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.172301 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.172506 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:01 crc kubenswrapper[4885]: E1205 20:06:01.172713 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:01 crc kubenswrapper[4885]: E1205 20:06:01.172936 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:01 crc kubenswrapper[4885]: E1205 20:06:01.173166 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.210534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.210577 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.210588 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.210607 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.210623 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.314584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.314648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.314668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.314692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.314710 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.394281 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe" exitCode=0 Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.394384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.396598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grvrw" event={"ID":"38f80533-e916-4ba6-9c5d-86f0dbd6f521","Type":"ContainerStarted","Data":"c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.418015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.418094 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.418113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.418136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.418152 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.423008 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.440253 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.453385 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.467742 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.479919 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.498274 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.513578 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.522297 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.522456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.522580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.522687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.522780 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.527971 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.540131 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.553762 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.571045 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.589659 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.625292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.625337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.625350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.625371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.625385 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.628187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.675973 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPa
th\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.713351 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.727798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.727855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.727878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.727900 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.727915 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.752644 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.797343 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.831243 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.831296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.831308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.831330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.831344 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.833728 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.869353 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.913437 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.934186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.934239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.934258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.934280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.934296 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:01Z","lastTransitionTime":"2025-12-05T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.947128 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:01 crc kubenswrapper[4885]: I1205 20:06:01.993395 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:01Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.034201 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.036738 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.036792 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.036811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.036837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.036856 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.072096 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.116398 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.139805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.139875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.139893 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.139921 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.139938 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.149992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.192892 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.232460 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.242731 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.242789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.242802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.242826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.242841 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.345507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.345591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.345610 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.345631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.345645 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.404272 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217" exitCode=0 Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.404342 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.411634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.429538 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.451498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.451563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.451579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.451609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.451626 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.454145 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.487040 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.528785 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.542278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.553480 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.554912 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.554947 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.554960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.554982 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.554996 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.567970 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.588805 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.603998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.628877 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.658181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.658376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.658484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.658585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.658689 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.670234 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:
06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.708805 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.750604 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.760798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.760840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.760851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.760865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.760876 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.792665 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.864274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.864340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.864363 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.864390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.864414 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.966913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.967011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.967034 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.967054 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:02 crc kubenswrapper[4885]: I1205 20:06:02.967088 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:02Z","lastTransitionTime":"2025-12-05T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.071687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.071734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.071745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.071763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.071777 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.172434 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.172522 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.172565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:03 crc kubenswrapper[4885]: E1205 20:06:03.172585 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:03 crc kubenswrapper[4885]: E1205 20:06:03.172716 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:03 crc kubenswrapper[4885]: E1205 20:06:03.172849 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.175212 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.175245 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.175258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.175276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.175289 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.277675 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.277706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.277714 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.277729 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.277737 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.381014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.381107 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.381120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.381143 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.381160 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.420672 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9" exitCode=0 Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.420755 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.445859 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.467430 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.484318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.484369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.484383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.484403 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.484417 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.500847 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.523233 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.538914 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.555617 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.568554 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.584993 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.587920 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.587950 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.587964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.587982 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.587993 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.602791 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.617638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.632331 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.648552 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.664178 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.677281 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.690772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.690840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.690867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.690903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.690932 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.794595 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.794654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.794664 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.794686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.794700 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.897632 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.897684 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.897698 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.897716 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:03 crc kubenswrapper[4885]: I1205 20:06:03.897729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:03Z","lastTransitionTime":"2025-12-05T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.001223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.001301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.001319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.001348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.001366 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.104307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.104409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.104420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.104440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.104455 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.208637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.209278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.209303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.209333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.209352 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.312850 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.312892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.312901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.312918 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.312930 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.416161 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.416230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.416245 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.416647 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.416686 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.428778 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.429306 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.434457 4885 generic.go:334] "Generic (PLEG): container finished" podID="b15623dc-71c3-4ee6-9078-3980cada3660" containerID="c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7" exitCode=0
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.434537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerDied","Data":"c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.447223 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.455277 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.464708 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.479780 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.496771 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.515844 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.522179 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.522228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.522248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.522265 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.522277 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.541278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.553990 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.568868 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.581923 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.595749 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.611593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.624104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.624152 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.624164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.624181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.624191 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.625849 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.639078 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.651907 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.663033 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.673967 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.689519 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.698529 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.711052 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.729924 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.729961 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.729974 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.729992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.730006 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.730038 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.743404 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.759634 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.775346 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.788217 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.808112 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05
:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.824288 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.831948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.831987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.831997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.832013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.832044 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.842570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.868736 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.935944 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.936069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.936092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.936118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:04 crc kubenswrapper[4885]: I1205 20:06:04.936141 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:04Z","lastTransitionTime":"2025-12-05T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.039201 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.039278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.039303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.039332 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.039357 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.142848 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.142900 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.142914 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.142933 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.142948 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.172412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:05 crc kubenswrapper[4885]: E1205 20:06:05.172573 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.172441 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.172830 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:05 crc kubenswrapper[4885]: E1205 20:06:05.173135 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:05 crc kubenswrapper[4885]: E1205 20:06:05.173246 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.190841 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.209414 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.230000 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.245413 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.245464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.245476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.245515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.245528 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.255553 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.276256 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.289502 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.306214 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.320797 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.342190 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.347481 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.347516 
4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.347526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.347543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.347556 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.356184 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.372200 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.382732 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.402804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.418901 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.443130 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" event={"ID":"b15623dc-71c3-4ee6-9078-3980cada3660","Type":"ContainerStarted","Data":"bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.443794 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.444048 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.449147 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.449186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.449200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.449216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.449263 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.464632 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.473730 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.478718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.495379 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f836
2f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.508398 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.523989 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.538174 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.551110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.551143 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.551154 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.551171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.551184 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.552287 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.567451 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 
2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.579722 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.595059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.606878 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.618503 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.633651 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.653954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.654047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.654074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.654105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.654127 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.661399 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304
cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.678545 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.696638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.710811 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.750152 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.757357 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.757516 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.757632 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.757740 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.757825 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.788997 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.832004 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.860660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.860737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.860753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.860778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.860796 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.873592 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.917088 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.957909 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.963891 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.963940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.963962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.963986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.964003 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:05Z","lastTransitionTime":"2025-12-05T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:05 crc kubenswrapper[4885]: I1205 20:06:05.993646 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.031741 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.067350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.067413 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.067423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.067442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.067454 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.080201 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.114428 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.165863 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.170766 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.170824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.170842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.170868 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.170887 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.274708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.274767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.274778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.274800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.274813 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.378048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.378104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.378116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.378136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.378148 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.409142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.409204 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.409219 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.409238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.409252 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.425910 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.430716 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.430770 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.430781 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.430802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.430815 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.452406 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.456484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.456522 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.456535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.456551 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.456565 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.471493 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.475571 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.475622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.475632 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.475651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.475662 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.489488 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.493250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.493311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.493325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.493344 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.493360 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.506655 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:06 crc kubenswrapper[4885]: E1205 20:06:06.506892 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.508806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.508857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.508875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.508896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.508911 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.610749 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.610796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.610807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.610824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.610835 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.713695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.713759 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.713773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.713789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.713801 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.816793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.816858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.816877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.816953 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.816979 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.919158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.919390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.919398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.919411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:06 crc kubenswrapper[4885]: I1205 20:06:06.919419 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:06Z","lastTransitionTime":"2025-12-05T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.073627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.073659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.073667 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.073682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.073691 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.171770 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.171853 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.171891 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:07 crc kubenswrapper[4885]: E1205 20:06:07.171960 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:07 crc kubenswrapper[4885]: E1205 20:06:07.172120 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:07 crc kubenswrapper[4885]: E1205 20:06:07.172259 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.176837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.176893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.176909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.176937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.176954 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.280421 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.280487 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.280502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.280527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.280545 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.390752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.390812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.390824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.390843 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.390855 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.453558 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/0.log" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.457985 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178" exitCode=1 Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.458075 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.459260 4885 scope.go:117] "RemoveContainer" containerID="b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.481765 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.495290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.495377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.495404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.495437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.495461 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.499517 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.520229 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.546890 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.563264 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.581643 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.598567 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.598619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.598631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.598648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.598659 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.599853 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.616003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.634704 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.650282 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.667235 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.702375 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.718586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.718652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.718669 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.718692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.718709 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.735252 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.767581 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:07Z\\\",\\\"message\\\":\\\"espace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:06:07.009591 6166 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:06:07.009952 6166 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:07.010087 6166 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:07.010150 6166 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 20:06:07.010178 6166 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 20:06:07.010114 6166 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:06:07.010221 6166 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:07.010247 6166 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:07.010272 6166 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:07.010249 6166 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:06:07.010306 6166 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:07.010327 6166 factory.go:656] Stopping watch factory\\\\nI1205 20:06:07.010332 6166 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:06:07.010382 6166 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:07Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.821774 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.821839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.821854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.821880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.821896 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.924369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.924414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.924427 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.924444 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:07 crc kubenswrapper[4885]: I1205 20:06:07.924460 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:07Z","lastTransitionTime":"2025-12-05T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.027444 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.027489 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.027499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.027519 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.027530 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.130636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.130689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.130708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.130733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.130746 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.232992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.233075 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.233104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.233131 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.233146 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.336169 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.336228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.336247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.336270 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.336282 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.438438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.438478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.438487 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.438503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.438513 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.463817 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/0.log" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.466587 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.467075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.480766 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.492166 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/mul
tus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.511899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.525763 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.538631 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.541198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.541256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.541279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.541308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.541334 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.555456 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.573430 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.592792 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.622926 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc
/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:07Z\\\",\\\"message\\\":\\\"espace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:06:07.009591 6166 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:06:07.009952 6166 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:07.010087 6166 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:07.010150 6166 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 20:06:07.010178 6166 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 20:06:07.010114 6166 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:06:07.010221 6166 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:07.010247 6166 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:07.010272 6166 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:07.010249 6166 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:06:07.010306 6166 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:07.010327 6166 factory.go:656] Stopping watch factory\\\\nI1205 20:06:07.010332 6166 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:06:07.010382 6166 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.644843 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.644928 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.644948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.644973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.644991 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.646314 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.668518 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.692980 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.706861 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.725981 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.748188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.748261 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.748286 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.748320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.748343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.851343 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.851422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.851464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.851515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.851540 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.955178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.955247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.955264 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.955286 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:08 crc kubenswrapper[4885]: I1205 20:06:08.955303 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:08Z","lastTransitionTime":"2025-12-05T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.058926 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.058998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.059067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.059102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.059126 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.084422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.084560 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:06:25.084525621 +0000 UTC m=+50.381341322 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.084617 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.084698 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.084735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.084776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.084921 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.084985 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:06:25.084969505 +0000 UTC m=+50.381785206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085172 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085202 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085220 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085259 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085290 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:25.085270774 +0000 UTC m=+50.382086465 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085340 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:25.085320066 +0000 UTC m=+50.382135727 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.085420 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.086109 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.086138 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.086231 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:25.086206263 +0000 UTC m=+50.383022014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.162663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.162725 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.162739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.162761 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.162779 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.172093 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.172243 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.172343 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.172429 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.172617 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.172780 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.265425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.265486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.265495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.265515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.265527 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.368511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.368566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.368584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.368609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.368628 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.470344 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.470383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.470392 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.470406 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.470416 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.472423 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/1.log" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.473149 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/0.log" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.476513 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0" exitCode=1 Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.476557 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.476603 4885 scope.go:117] "RemoveContainer" containerID="b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.477241 4885 scope.go:117] "RemoveContainer" containerID="3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0" Dec 05 20:06:09 crc kubenswrapper[4885]: E1205 20:06:09.477484 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.499744 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.515644 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.527342 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.544070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.557114 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.573061 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.573096 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.573108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.573125 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.573138 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.586931 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c
6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31be0bc3e06ee7097b020cb25b93f87589ea304cc215230d9b1c139857dc178\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:07Z\\\",\\\"message\\\":\\\"espace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 20:06:07.009591 6166 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:06:07.009952 6166 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:07.010087 6166 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:07.010150 6166 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 20:06:07.010178 6166 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 20:06:07.010114 6166 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:06:07.010221 6166 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:07.010247 6166 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:07.010272 6166 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:07.010249 6166 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 20:06:07.010306 6166 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:07.010327 6166 factory.go:656] Stopping watch factory\\\\nI1205 20:06:07.010332 6166 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:06:07.010382 6166 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 
20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.598005 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.611142 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.621915 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.630563 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.646175 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.657968 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.674781 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.675510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.675570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.675583 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.675996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.676048 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.691942 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.779231 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.779275 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.779292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.779313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.779327 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.882411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.882478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.882502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.882530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.882552 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.984904 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.984979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.984992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.985007 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:09 crc kubenswrapper[4885]: I1205 20:06:09.985033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:09Z","lastTransitionTime":"2025-12-05T20:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.087975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.088040 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.088051 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.088068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.088081 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.190899 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.190960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.190978 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.191006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.191063 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.293699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.293757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.293775 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.293797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.293814 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.396220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.396279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.396297 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.396321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.396339 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.482118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/1.log"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.486991 4885 scope.go:117] "RemoveContainer" containerID="3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0"
Dec 05 20:06:10 crc kubenswrapper[4885]: E1205 20:06:10.487367 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.498947 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.499005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.499060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.499090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.499109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.508714 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.527470 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.553881 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"]
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.554726 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.557498 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.558477 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.560138 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.583612 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.601080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.601177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9g6\" (UniqueName: \"kubernetes.io/projected/f8bd00a1-3879-4791-8e78-150f2a0bf522-kube-api-access-sr9g6\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.601254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.601286 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.601786 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.602703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.602778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.602801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.602834 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.602860 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.625791 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.641846 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.665398 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.684126 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.702149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.702289 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9g6\" (UniqueName: \"kubernetes.io/projected/f8bd00a1-3879-4791-8e78-150f2a0bf522-kube-api-access-sr9g6\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.702357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.702412 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.703476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.703734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706231 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706289 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.706320 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.713517 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8bd00a1-3879-4791-8e78-150f2a0bf522-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.729277 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.734679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9g6\" (UniqueName: \"kubernetes.io/projected/f8bd00a1-3879-4791-8e78-150f2a0bf522-kube-api-access-sr9g6\") pod \"ovnkube-control-plane-749d76644c-6hhxl\" (UID: \"f8bd00a1-3879-4791-8e78-150f2a0bf522\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.751109 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.773600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.793924 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.810009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.810299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.810345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.810382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.810408 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.811188 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.827354 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.841536 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.852839 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.855599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.865284 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.877935 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.879169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.895188 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent
\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.913066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.913137 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.913147 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.913161 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.913170 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:10Z","lastTransitionTime":"2025-12-05T20:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.917003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.936551 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.957127 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.972377 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:10 crc kubenswrapper[4885]: I1205 20:06:10.988210 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.000763 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.012249 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.016180 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.016221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.016235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.016252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.016264 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.036713 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.055152 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c
6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.065538 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.079103 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.095530 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.108779 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.118477 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.118526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.118542 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.118564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.118579 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.125294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.140063 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.151078 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.166248 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.171799 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.171855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.171918 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.172002 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.172139 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.172391 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.181215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.194706 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.207092 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.221839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.221884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.221895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.221913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.221926 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.224577 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc 
kubenswrapper[4885]: I1205 20:06:11.241260 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.253425 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.325236 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.325290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.325301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.325321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.325336 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.428340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.428386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.428398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.428415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.428427 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.491467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" event={"ID":"f8bd00a1-3879-4791-8e78-150f2a0bf522","Type":"ContainerStarted","Data":"25b7b24568fb2d28cbf6859c70c83ae12995e35f485e1bad917d24f8fa4fb9db"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.532786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.532857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.532879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.532910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.532943 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.637785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.637835 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.637853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.637875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.637895 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.687898 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2jdj4"] Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.688754 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.688851 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.713465 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.734095 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.740350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.740416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.740441 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.740469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.740487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.753430 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.776381 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 
2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.793636 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.814344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.815102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pf8f\" (UniqueName: \"kubernetes.io/projected/a5c0a952-e24a-49c2-b4ba-e20be61b840d-kube-api-access-4pf8f\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.815160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.835556 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.843808 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.843879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.843897 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.844397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.844459 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.860721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.879508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.897115 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.915724 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.915821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pf8f\" (UniqueName: \"kubernetes.io/projected/a5c0a952-e24a-49c2-b4ba-e20be61b840d-kube-api-access-4pf8f\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.915946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.916151 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:11 crc kubenswrapper[4885]: E1205 20:06:11.916250 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:12.41622475 +0000 UTC m=+37.713040451 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.927172 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.931312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pf8f\" (UniqueName: \"kubernetes.io/projected/a5c0a952-e24a-49c2-b4ba-e20be61b840d-kube-api-access-4pf8f\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.937058 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.946114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.946153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.946164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.946180 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.946192 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:11Z","lastTransitionTime":"2025-12-05T20:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.948630 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.962746 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:11 crc kubenswrapper[4885]: I1205 20:06:11.980894 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:11Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.049104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.049160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.049171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.049188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.049203 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.151618 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.151927 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.151994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.152160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.152244 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.255138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.255187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.255203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.255221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.255235 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.358579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.358624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.358639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.358658 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.358682 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.421749 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:12 crc kubenswrapper[4885]: E1205 20:06:12.421938 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:12 crc kubenswrapper[4885]: E1205 20:06:12.421984 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:13.421970178 +0000 UTC m=+38.718785829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.461957 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.462279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.462375 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.462470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.462551 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.497922 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" event={"ID":"f8bd00a1-3879-4791-8e78-150f2a0bf522","Type":"ContainerStarted","Data":"4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.497968 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" event={"ID":"f8bd00a1-3879-4791-8e78-150f2a0bf522","Type":"ContainerStarted","Data":"21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.518976 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\
"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2ee
ade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.530642 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.544012 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.560057 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.565227 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.565251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.565259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.565272 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.565298 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.575143 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.591246 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.610694 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.629433 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.646611 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.669434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.669526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.669550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.669583 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.669621 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.672495 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.687317 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.704416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.729600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.771818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c
6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.772839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.772918 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.772934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.772951 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.772963 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.791933 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.804918 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.874945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.874985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.874994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.875008 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.875021 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.978163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.978228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.978253 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.978281 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:12 crc kubenswrapper[4885]: I1205 20:06:12.978301 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:12Z","lastTransitionTime":"2025-12-05T20:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.081559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.081915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.082162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.082392 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.082607 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.172114 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.172144 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.172306 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.172302 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.172340 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.172486 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.172583 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.173151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.187289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.187378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.187442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.187475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.187545 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.290222 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.290298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.290322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.290350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.290375 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.393107 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.393175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.393191 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.393207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.393217 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.432960 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.433162 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:13 crc kubenswrapper[4885]: E1205 20:06:13.433230 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:15.43320837 +0000 UTC m=+40.730024041 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.496233 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.496283 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.496317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.496340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.496358 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.599726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.599787 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.599804 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.599831 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.599849 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.702800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.702838 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.702846 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.702860 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.702869 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.805976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.806273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.806312 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.806348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.806380 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.908845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.908918 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.908941 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.908971 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:13 crc kubenswrapper[4885]: I1205 20:06:13.908993 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:13Z","lastTransitionTime":"2025-12-05T20:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.012047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.012086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.012097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.012115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.012128 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.115104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.115174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.115189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.115210 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.115228 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.217869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.217913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.217922 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.217936 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.217946 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.320137 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.320200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.320223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.320253 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.320274 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.422854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.422928 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.422945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.422968 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.422991 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.525572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.525612 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.525622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.525635 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.525645 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.628704 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.628765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.628782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.628806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.628822 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.731962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.732023 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.732098 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.732136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.732160 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.835633 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.835713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.835730 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.835755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.835773 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.939461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.939531 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.939550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.939578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:14 crc kubenswrapper[4885]: I1205 20:06:14.939598 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:14Z","lastTransitionTime":"2025-12-05T20:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.042557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.042623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.042640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.042666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.042684 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.145780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.145842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.145860 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.145886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.145910 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.172526 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.172593 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.172664 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.172712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.172821 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.172977 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.173219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.173495 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.191177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.212800 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.236860 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.248681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.248750 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.248775 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.248806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.248831 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.262094 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.278225 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.297519 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.317197 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.329718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.343622 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.351776 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.351840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.351863 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.351892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.351914 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.377849 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.393125 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.405488 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.425848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.441084 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.454991 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.455061 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.455078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.455098 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.455111 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.455565 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.458124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.458296 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:15 crc kubenswrapper[4885]: E1205 20:06:15.458360 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:19.458343316 +0000 UTC m=+44.755158987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.471236 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:15Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.557997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.558105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.558125 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.558148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.558167 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.661102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.661146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.661157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.661174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.661186 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.763790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.763947 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.763969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.763995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.764014 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.867317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.867378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.867396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.867422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.867440 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.970635 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.970716 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.970733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.970758 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:15 crc kubenswrapper[4885]: I1205 20:06:15.970779 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:15Z","lastTransitionTime":"2025-12-05T20:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.073950 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.074066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.074092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.074129 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.074151 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.177549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.177596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.177609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.177625 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.177636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.280591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.280633 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.280641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.280655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.280664 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.384005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.384076 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.384088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.384106 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.384119 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.487439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.487489 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.487499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.487519 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.487533 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.590517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.590585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.590609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.590634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.590651 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.631974 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.632393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.632557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.632727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.632877 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.656741 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:16Z is after 2025-08-24T17:21:41Z"
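The NotReady condition repeated through these records carries one concrete cause: the container runtime reports no CNI configuration file in /etc/kubernetes/cni/net.d/. Below is a hedged Go sketch of that check; the directory path is taken from the log, while the accepted extensions (.conf, .conflist, .json) are conventional CNI names assumed here, not something the log states.

```go
// Sketch: report whether /etc/kubernetes/cni/net.d holds any CNI network
// configuration, mirroring the NetworkPluginNotReady message above.
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		name := e.Name()
		// Extension list is an assumption based on CNI convention,
		// not taken from kubelet or CRI-O source.
		if strings.HasSuffix(name, ".conf") ||
			strings.HasSuffix(name, ".conflist") ||
			strings.HasSuffix(name, ".json") {
			fmt.Println("found CNI config:", name)
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration files; node stays NotReady")
	}
}
```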
event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.663409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.663429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.663442 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.683759 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.688464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.688535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.688549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.688573 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.688591 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.706461 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.714056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.714104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.714116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.714135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.714146 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.733438 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:16Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.738703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.738750 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.738763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.738780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.738792 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.755890 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:16Z is after 2025-08-24T17:21:41Z"
Dec 05 20:06:16 crc kubenswrapper[4885]: E1205 20:06:16.756061 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.757657 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.757709 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.757727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.757748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.757767 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.861093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.861142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.861156 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.861177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.861190 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.963561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.963603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.963614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.963634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:16 crc kubenswrapper[4885]: I1205 20:06:16.963648 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:16Z","lastTransitionTime":"2025-12-05T20:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.066205 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.066250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.066261 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.066277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.066292 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.169055 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.169136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.169160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.169191 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.169215 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.172376 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.172428 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.172454 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:06:17 crc kubenswrapper[4885]: E1205 20:06:17.172521 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.172613 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:06:17 crc kubenswrapper[4885]: E1205 20:06:17.172665 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:06:17 crc kubenswrapper[4885]: E1205 20:06:17.172749 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:06:17 crc kubenswrapper[4885]: E1205 20:06:17.173063 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.272715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.272779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.272799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.272827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.272850 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.376659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.376727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.376744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.376769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.376792 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.480358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.480465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.480485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.480510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.480530 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.583217 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.583287 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.583298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.583329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.583343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.686356 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.686459 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.686484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.686515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.686533 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.790163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.790236 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.790254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.790274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.790288 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.894557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.894631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.894649 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.894674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.894697 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.997913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.997987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.998007 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.998071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:17 crc kubenswrapper[4885]: I1205 20:06:17.998105 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:17Z","lastTransitionTime":"2025-12-05T20:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.102086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.102153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.102165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.102188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.102204 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.205829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.205895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.205913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.205944 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.205962 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.309861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.309921 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.309936 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.309959 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.309973 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.413068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.413158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.413183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.413218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.413243 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.516669 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.516743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.516767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.516797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.516820 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.620734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.620834 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.620858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.620888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.620922 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.724551 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.724630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.724656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.724687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.724706 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.827823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.827888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.827905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.827928 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.827947 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.931374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.931445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.931469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.931501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:18 crc kubenswrapper[4885]: I1205 20:06:18.931524 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:18Z","lastTransitionTime":"2025-12-05T20:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.033734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.033821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.033847 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.033884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.033908 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.137278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.137416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.137446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.137469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.137483 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.171739 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.171863 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.171868 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.171739 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.171965 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.172127 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.172196 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.172299 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.240840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.240895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.240913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.240934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.240949 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.343939 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.344071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.344092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.344116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.344134 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.446992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.447068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.447081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.447100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.447115 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.505378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.505672 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:06:19 crc kubenswrapper[4885]: E1205 20:06:19.505809 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:27.505781922 +0000 UTC m=+52.802597583 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.550430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.550491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.550502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.550517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.550526 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.654056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.654129 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.654142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.654162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:19 crc kubenswrapper[4885]: I1205 20:06:19.654173 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:19Z","lastTransitionTime":"2025-12-05T20:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:06:21 crc kubenswrapper[4885]: I1205 20:06:21.171836 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:06:21 crc kubenswrapper[4885]: I1205 20:06:21.171876 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:06:21 crc kubenswrapper[4885]: I1205 20:06:21.171961 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:06:21 crc kubenswrapper[4885]: I1205 20:06:21.171974 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:06:21 crc kubenswrapper[4885]: E1205 20:06:21.172175 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:06:21 crc kubenswrapper[4885]: E1205 20:06:21.172323 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:06:21 crc kubenswrapper[4885]: E1205 20:06:21.172453 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:06:21 crc kubenswrapper[4885]: E1205 20:06:21.172630 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
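Between 20:06:19 and 20:06:24 the kubelet republishes the same Ready=False condition roughly every 100 ms (the repeated kubelet_node_status.go / setters.go groups). A sketch, under the same kubeconfig assumption, that reads the condition back from the API; it should mirror the reason and message in the setters.go entries:

```go
// nodeready.go — hypothetical diagnostic: read the Ready condition of node "crc".
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Node name "crc" is taken from the log.
	node, err := client.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			// Expect Status=False, Reason=KubeletNotReady while CNI config is missing.
			fmt.Printf("Ready=%s reason=%s\nmessage=%s\n", c.Status, c.Reason, c.Message)
		}
	}
}
```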
Dec 05 20:06:23 crc kubenswrapper[4885]: I1205 20:06:23.172399 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:06:23 crc kubenswrapper[4885]: I1205 20:06:23.172460 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:06:23 crc kubenswrapper[4885]: I1205 20:06:23.172492 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:06:23 crc kubenswrapper[4885]: E1205 20:06:23.172609 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:06:23 crc kubenswrapper[4885]: I1205 20:06:23.172627 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:06:23 crc kubenswrapper[4885]: E1205 20:06:23.172758 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:06:23 crc kubenswrapper[4885]: E1205 20:06:23.172922 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:06:23 crc kubenswrapper[4885]: E1205 20:06:23.172999 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:06:24 crc kubenswrapper[4885]: I1205 20:06:24.921626 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:06:24 crc kubenswrapper[4885]: I1205 20:06:24.921677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:06:24 crc kubenswrapper[4885]: I1205 20:06:24.921694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:06:24 crc kubenswrapper[4885]: I1205 20:06:24.921719 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:06:24 crc kubenswrapper[4885]: I1205 20:06:24.921736 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:24Z","lastTransitionTime":"2025-12-05T20:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.024104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.024167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.024185 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.024211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.024228 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.127558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.127648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.127672 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.127700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.127728 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.170230 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170424 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:06:57.170381236 +0000 UTC m=+82.467196937 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.170539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.170606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.170660 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.170762 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170844 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170881 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170906 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170914 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170967 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:25 crc 
kubenswrapper[4885]: E1205 20:06:25.170982 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170991 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:57.170968585 +0000 UTC m=+82.467784286 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170918 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.171099 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:57.171070908 +0000 UTC m=+82.467886599 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.170917 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.171157 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:57.17113445 +0000 UTC m=+82.467950131 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.171209 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:06:57.171186892 +0000 UTC m=+82.468002583 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.171687 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.171788 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.171810 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.171863 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.171895 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.171965 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.172170 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:25 crc kubenswrapper[4885]: E1205 20:06:25.172956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.173391 4885 scope.go:117] "RemoveContainer" containerID="3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.185130 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.206253 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.231357 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.231700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.231737 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.231520 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.
126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.231769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.232065 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.247083 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.269614 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.291633 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.302982 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.315118 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.328340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.334888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.335120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.335251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.335382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.335520 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.341704 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.372099 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.383776 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.397973 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.410920 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.421053 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.433391 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067461
6e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.437262 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.437304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.437313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.437327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.437336 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.540469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.540511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.540522 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.540542 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.540555 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.549187 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/1.log" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.551939 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.552552 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.571439 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.590086 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.605309 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.630349 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.643104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.643165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.643178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.643412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.643429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.650242 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.662059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.679924 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef4
5f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 
20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.691313 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.703487 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.715667 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.728216 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.736998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.745998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.746211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.746280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.746368 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.746427 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.752273 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is 
after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.766324 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.780312 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.793836 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:25Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.848571 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.848883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.849063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.849198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.849314 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.952090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.952154 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.952171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.952194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:25 crc kubenswrapper[4885]: I1205 20:06:25.952211 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:25Z","lastTransitionTime":"2025-12-05T20:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.055785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.055832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.055843 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.055859 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.055871 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.159478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.159865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.160048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.160187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.160311 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.263001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.263057 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.263069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.263083 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.263092 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.366049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.366100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.366109 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.366122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.366132 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.468979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.469091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.469119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.469149 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.469173 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.558258 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/2.log" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.559176 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/1.log" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.562961 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" exitCode=1 Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.563021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.563113 4885 scope.go:117] "RemoveContainer" containerID="3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.565365 4885 scope.go:117] "RemoveContainer" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" Dec 05 20:06:26 crc kubenswrapper[4885]: E1205 20:06:26.566005 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.572056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.572094 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.572105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.572121 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.572134 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.583116 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.600253 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.617980 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.630232 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.645372 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.670269 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef4
5f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3225fef475c12a8a36a8cf9796e12053e2185e6c6a27627c600ab1963e4ab9f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:06:08.368301 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:06:08.368379 6307 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:06:08.368396 6307 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:06:08.368452 6307 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:06:08.368489 6307 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:06:08.368504 6307 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:06:08.368578 6307 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:06:08.368611 6307 factory.go:656] Stopping watch factory\\\\nI1205 20:06:08.368632 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:06:08.368667 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:06:08.368687 6307 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:06:08.368702 6307 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:06:08.368714 6307 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:06:08.368725 6307 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:06:08.368740 6307 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:06:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.675080 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.675106 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.675115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.675128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.675137 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.689270 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.707866 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.731673 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.746093 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.766500 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.777330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.777369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.777382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.777399 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.777412 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.783107 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.796700 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.810630 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.825810 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.839545 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.880000 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.880065 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.880081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.880104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.880122 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.926952 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.927054 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.927077 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.927102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.927124 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: E1205 20:06:26.950529 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.955517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.955580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.955603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.955632 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.955653 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:26 crc kubenswrapper[4885]: E1205 20:06:26.977972 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:26Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.983101 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.983169 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.983192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.983215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:26 crc kubenswrapper[4885]: I1205 20:06:26.983232 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:26Z","lastTransitionTime":"2025-12-05T20:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.008692 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.014641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.014715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
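
Every retry above fails at the same point: the kubelet's status PATCH is intercepted by the node.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743, and the TLS handshake to that endpoint fails because the serving certificate expired on 2025-08-24 while the node clock reads 2025-12-05. A minimal Go sketch, run on the node itself, to confirm what the endpoint presents (the address comes from the log; InsecureSkipVerify is deliberate so the handshake survives long enough to read the expired certificate rather than trust it):

    // Sketch: read the validity window of the certificate served on the
    // webhook port named in the log. Verification is exactly what fails
    // above, so it is skipped here on purpose.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%v notBefore=%v notAfter=%v\n",
                cert.Subject, cert.NotBefore.UTC(), cert.NotAfter.UTC())
        }
    }

If notAfter prints 2025-08-24T17:21:41Z, the serving certificate simply needs to be rotated; nothing in the kubelet's own retry loop can recover until it is.
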
event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.014740 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.014772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.014795 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.033455 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.038355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.038394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
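
Independently of the webhook failure, the Ready condition stays False because the CRI reports NetworkReady=false: there is not yet any CNI configuration in /etc/kubernetes/cni/net.d/ (OVN-Kubernetes writes one once its node pod is running). What the runtime keeps re-checking amounts to roughly the following sketch; this is an approximation for illustration, not the actual ocicni implementation (only the directory path is taken from the log):

    // Sketch (approximation): does the CNI conf dir contain any config file?
    // The runtime's network plugin looks for *.conf, *.conflist and *.json;
    // until one appears, it keeps reporting NetworkReady=false.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config present:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file in", dir, "- network stays NotReady")
        }
    }
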
event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.038405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.038421 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.038432 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.054229 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.054445 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.056059 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
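
The kubelet attempts the status patch a fixed number of times per sync loop (nodeStatusUpdateRetry, 5 in the upstream kubelet source) before logging "update node status exceeds retry count" as above, which is why the same payload repeats back-to-back. The payloads are hard to read because the whole error string is re-quoted when logged, so every JSON quote arrives as \\\"; a throwaway Go sketch that turns a pasted fragment back into readable JSON (the fragment below is a shortened sample, not the full payload):

    // Sketch: un-escape a patch fragment copied out of the journal and
    // pretty-print it. Paste any \\\"-escaped fragment between the backquotes.
    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "strings"
    )

    func main() {
        raw := `{\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"memory\\\":\\\"32865360Ki\\\"}}`
        unescaped := strings.ReplaceAll(raw, `\\\"`, `"`)
        var pretty bytes.Buffer
        if err := json.Indent(&pretty, []byte(unescaped), "", "  "); err != nil {
            fmt.Println("not valid JSON after unescaping:", err)
            return
        }
        fmt.Println(pretty.String())
    }
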
event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.056125 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.056147 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.056173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.056193 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.158559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.158625 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.158711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.158745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.158767 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.172075 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.172176 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.172240 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.172434 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.172416 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.172574 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.172689 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.172838 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.261283 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.261360 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.261372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.261397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.261415 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.364208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.364292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.364314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.364345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.364368 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.467745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.467812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.467836 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.467865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.467883 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.570426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.570502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.570526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.570555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.570576 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.572405 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/2.log" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.578622 4885 scope.go:117] "RemoveContainer" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.578922 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.598375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.598793 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:27 crc kubenswrapper[4885]: E1205 20:06:27.599112 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:06:43.598988305 +0000 UTC m=+68.895804386 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.600604 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.622985 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.639945 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.654388 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.673613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.673665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.673681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.673701 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.673716 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.677584 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.698434 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.724769 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef4
5f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.738387 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.756047 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.775171 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.776651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.776745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.776760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.776785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.776797 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.789112 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.804293 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 
2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.821086 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.837912 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.855371 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.868299 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:27Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.879973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.880005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.880042 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.880064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.880078 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.983661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.983769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.984230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.984674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:27 crc kubenswrapper[4885]: I1205 20:06:27.984744 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:27Z","lastTransitionTime":"2025-12-05T20:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.088251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.088310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.088328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.088357 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.088375 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.191114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.191160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.191168 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.191182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.191190 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.293703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.294211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.294317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.294422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.294543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.397775 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.397837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.397855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.397881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.397903 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.501224 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.501278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.501294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.501315 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.501332 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.604559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.604647 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.604671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.604697 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.604724 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.708190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.708256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.708279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.708309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.708335 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.811761 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.811818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.811837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.811865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.811886 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.913990 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.914051 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.914062 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.914079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:28 crc kubenswrapper[4885]: I1205 20:06:28.914089 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:28Z","lastTransitionTime":"2025-12-05T20:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.017330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.017483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.017510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.017545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.017570 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.120482 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.120572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.120589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.120613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.120630 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.171922 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.171948 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.171966 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.172038 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:29 crc kubenswrapper[4885]: E1205 20:06:29.172131 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:29 crc kubenswrapper[4885]: E1205 20:06:29.172204 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:29 crc kubenswrapper[4885]: E1205 20:06:29.172331 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:29 crc kubenswrapper[4885]: E1205 20:06:29.172438 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.222610 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.222650 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.222659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.222675 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.222685 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.240752 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.250285 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.255654 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92e
daf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.274447 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.292917 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.308706 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.319186 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.325945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.326053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.326082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.326112 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.326165 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.338671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is 
after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.358830 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.376098 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.393622 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.415530 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.428718 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.428808 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.428837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.428869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.428893 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.437171 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.452570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.468106 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.486997 4885 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.503169 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.531473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.531538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.531560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.531586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.531605 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.535796 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.634001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.634089 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.634107 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.634130 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.634147 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.737799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.737895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.737917 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.738335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.738356 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.842060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.842119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.842141 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.842172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.842196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.945947 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.946010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.946076 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.946100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:29 crc kubenswrapper[4885]: I1205 20:06:29.946118 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:29Z","lastTransitionTime":"2025-12-05T20:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.049298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.049365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.049388 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.049417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.049439 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.153361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.153431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.153453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.153480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.153501 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.256133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.256179 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.256193 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.256212 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.256229 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.359623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.359682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.359698 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.359720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.359735 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.463218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.463293 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.463319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.463349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.463367 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.566158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.566204 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.566216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.566232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.566244 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.669243 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.669292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.669305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.669329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.669339 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.771941 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.772051 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.772065 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.772094 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.772111 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.874916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.874979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.874991 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.875040 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.875080 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.977994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.978074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.978084 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.978101 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:30 crc kubenswrapper[4885]: I1205 20:06:30.978112 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:30Z","lastTransitionTime":"2025-12-05T20:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.081912 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.081999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.082072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.082105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.082124 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.172343 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.172430 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:31 crc kubenswrapper[4885]: E1205 20:06:31.172534 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.172421 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:31 crc kubenswrapper[4885]: E1205 20:06:31.172663 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.172434 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:31 crc kubenswrapper[4885]: E1205 20:06:31.172841 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:31 crc kubenswrapper[4885]: E1205 20:06:31.172965 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.184686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.184731 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.184746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.184764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.184782 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.288168 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.288255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.288283 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.288317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.288343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.391778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.391903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.391926 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.391966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.391999 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.495502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.495686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.495719 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.495750 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.495780 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.598250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.598307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.598326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.598350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.598368 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.701985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.702079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.702097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.702120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.702137 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.804516 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.804592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.804616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.804646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.804668 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.907697 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.907760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.907791 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.907828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:31 crc kubenswrapper[4885]: I1205 20:06:31.907849 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:31Z","lastTransitionTime":"2025-12-05T20:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.010472 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.010515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.010534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.010553 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.010567 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.113725 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.113781 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.113802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.113829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.113851 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.216705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.216782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.216808 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.216837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.216858 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.320498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.320570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.320594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.320627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.320653 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.423056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.423105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.423118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.423136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.423148 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.526360 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.526407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.526418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.526435 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.526447 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.628744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.628806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.628829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.628856 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.628877 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.731948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.731999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.732054 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.732086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.732108 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.835485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.835562 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.835572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.835593 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.835608 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.938174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.938230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.938239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.938259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:32 crc kubenswrapper[4885]: I1205 20:06:32.938276 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:32Z","lastTransitionTime":"2025-12-05T20:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.041141 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.041202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.041212 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.041233 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.041247 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.143515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.143578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.143596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.143627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.143651 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.172291 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.172398 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.172399 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:33 crc kubenswrapper[4885]: E1205 20:06:33.172485 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.172540 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:33 crc kubenswrapper[4885]: E1205 20:06:33.172671 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:33 crc kubenswrapper[4885]: E1205 20:06:33.172804 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:33 crc kubenswrapper[4885]: E1205 20:06:33.173869 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.246809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.246868 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.246879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.246898 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.246910 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.350298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.350329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.350338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.350353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.350364 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.452920 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.452994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.453011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.453069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.453086 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.555475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.555509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.555518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.555536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.555546 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.658378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.658429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.658437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.658451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.658460 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.762210 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.762267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.762283 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.762309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.762327 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.865160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.865203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.865218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.865258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.865270 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.968235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.968294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.968311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.968334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:33 crc kubenswrapper[4885]: I1205 20:06:33.968350 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:33Z","lastTransitionTime":"2025-12-05T20:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.071340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.071425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.071454 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.071485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.071507 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.174513 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.174568 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.174582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.174599 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.174614 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.278200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.278270 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.278295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.278320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.278338 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.381425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.381549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.381569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.381592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.381611 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.484510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.484565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.484582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.484604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.484621 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.587809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.587869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.587887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.587910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.587927 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.690332 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.690407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.690431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.690463 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.690497 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.793735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.793808 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.793826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.793851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.793869 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.897259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.897321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.897339 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.897368 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:34 crc kubenswrapper[4885]: I1205 20:06:34.897392 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:34Z","lastTransitionTime":"2025-12-05T20:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.000183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.000257 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.000272 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.000295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.000307 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.105423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.105578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.105602 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.105631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.105652 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.172445 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.172510 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.172471 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:35 crc kubenswrapper[4885]: E1205 20:06:35.172628 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:35 crc kubenswrapper[4885]: E1205 20:06:35.172803 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:35 crc kubenswrapper[4885]: E1205 20:06:35.173005 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.173148 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:35 crc kubenswrapper[4885]: E1205 20:06:35.173305 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.195769 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.208273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.208314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.208328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.208349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.208364 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
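From here on, every status patch fails the same way: the API server cannot call the pod.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node's clock reads 2025-12-05T20:06:35Z, a classic symptom of resuming a CRC VM long after its certificates were minted. A small sketch (run on the node itself, since the endpoint is loopback-only) that dials the address from the log and prints the certificate's validity window:

```go
// certcheck.go - sketch: inspect the serving certificate of the webhook
// endpoint the status patches are failing against (address taken from the
// log records above).
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // pod.network-node-identity.openshift.io webhook, per the log
	conn, err := tls.Dial("tcp", addr, &tls.Config{
		InsecureSkipVerify: true, // we only want to read the cert, not trust it
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%t\n",
		cert.Subject, cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339), now.After(cert.NotAfter))
	// With the timestamps in this trace: 2025-12-05T20:06:35Z is after
	// 2025-08-24T17:21:41Z, so every call through this webhook fails until
	// the cluster's certificate-recovery machinery rotates the cert.
}
```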
Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.214391 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.231265 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.255051 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.277879 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.294424 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.319699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.320122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.320310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.320474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.320685 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.351584 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef4
5f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.369060 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.383174 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.395138 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.410802 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.423787 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.423821 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.423832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.423849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.423860 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.428670 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.449496 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.467446 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.482963 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.500559 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.516695 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.525961 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.526031 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.526046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.526063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 
20:06:35.526075 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.628533 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.628860 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.628873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.628890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.628899 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.731689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.731750 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.731764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.731788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.731801 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.834331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.834407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.834430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.834460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.834481 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.937773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.937829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.937845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.937890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:35 crc kubenswrapper[4885]: I1205 20:06:35.937906 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:35Z","lastTransitionTime":"2025-12-05T20:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.042923 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.043051 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.043088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.043118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.043139 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.146754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.146821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.146837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.146861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.146878 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.249424 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.249484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.249507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.249537 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.249561 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.352227 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.352275 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.352286 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.352303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.352314 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.454596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.454663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.454680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.454703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.454719 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.557738 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.557809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.557827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.557854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.557870 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.660691 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.660736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.660749 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.660771 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.660784 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.763812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.763880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.763950 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.763982 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.764005 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.866776 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.866847 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.866865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.866891 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.866910 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.969908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.969990 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.970015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.970085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:36 crc kubenswrapper[4885]: I1205 20:06:36.970106 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:36Z","lastTransitionTime":"2025-12-05T20:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.072972 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.073036 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.073046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.073061 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.073071 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.171938 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.171943 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.172674 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.172733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.172797 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.172891 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.176544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.176589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.176604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.176627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.176640 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.178385 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.178385 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.279451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.279526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.279544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.279569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.279588 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.356440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.356503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.356524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.356548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.356566 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.380363 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.385752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.385798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.385809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.385825 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.385836 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.406398 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.410438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.410465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
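
Every "Error updating node status, will retry" record in this boot fails the same way: the API server cannot call the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743 because its serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current time of 2025-12-05T20:06:37Z, so each status patch is rejected and the kubelet retries. As a diagnostic illustration only (a minimal sketch, not OpenShift or kubelet source), a Go program that performs the same x509 validity check against the endpoint named in the log:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Endpoint taken from the log records above; skip chain verification so we
	// can inspect a certificate that is already expired instead of failing the
	// handshake the way the API server's webhook client does.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := state.PeerCertificates[0]
	fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
		cert.Subject, cert.NotBefore.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	if now := time.Now(); now.After(cert.NotAfter) {
		// The same condition x509 verification reports in the records above:
		// "certificate has expired or is not yet valid".
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}

Until that serving certificate is rotated, the kubelet keeps retrying the patch and logging the identical failure.
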
event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.410475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.410488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.410499 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.427641 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.433314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.433384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
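
The NodeNotReady condition repeated above, and the earlier per-pod "Error syncing pod, skipping" records, all reduce to one missing piece: there is no CNI network configuration in /etc/kubernetes/cni/net.d/, so the container runtime reports NetworkReady=false and no pod sandboxes can be created. A hedged sketch of that readiness check, assuming only the directory named in the log (the accepted extensions mirror what CNI config loaders conventionally scan for; this is illustrative, not CRI-O or kubelet code):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the log records above.
	const confDir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	var configs []string
	for _, e := range entries {
		// Extensions CNI config loaders conventionally accept.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			configs = append(configs, e.Name())
		}
	}
	if len(configs) == 0 {
		// Mirrors the message in the log: "no CNI configuration file in
		// /etc/kubernetes/cni/net.d/. Has your network provider started?"
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Printf("found CNI config(s): %v\n", configs)
}

Once the network provider writes its configuration into that directory, the runtime should report NetworkReady=true on a subsequent sync and the sandbox creation retried above can proceed.
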
event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.433408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.433439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.433465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.449567 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.453696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.453739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.453785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.453808 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.453823 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.467984 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:37 crc kubenswrapper[4885]: E1205 20:06:37.468161 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.470496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.470530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.470541 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.470557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.470569 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.573661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.573725 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.573741 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.573764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.573782 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.676337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.676384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.676398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.676416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.676429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.779160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.779227 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.779247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.779276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.779297 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.882109 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.882207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.882223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.882247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.882265 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.985010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.985086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.985097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.985117 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:37 crc kubenswrapper[4885]: I1205 20:06:37.985131 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:37Z","lastTransitionTime":"2025-12-05T20:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.087397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.087476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.087488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.087508 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.087523 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.189917 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.189960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.189970 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.189984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.189994 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.293112 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.293170 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.293181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.293221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.293236 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.396200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.396247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.396260 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.396274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.396285 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.499148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.499215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.499233 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.499262 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.499279 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.607410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.607465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.607486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.607517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.607541 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.709603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.709654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.709663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.709680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.709689 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.812462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.812500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.812526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.812539 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.812548 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.914761 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.914815 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.914827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.914845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:38 crc kubenswrapper[4885]: I1205 20:06:38.914860 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:38Z","lastTransitionTime":"2025-12-05T20:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.017423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.017451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.017463 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.017477 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.017504 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.120138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.120172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.120181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.120193 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.120201 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.171701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.171717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.171717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:39 crc kubenswrapper[4885]: E1205 20:06:39.171812 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.171862 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:39 crc kubenswrapper[4885]: E1205 20:06:39.172007 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:39 crc kubenswrapper[4885]: E1205 20:06:39.172231 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:39 crc kubenswrapper[4885]: E1205 20:06:39.172336 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.222693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.222725 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.222737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.222752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.222764 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.324934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.324995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.325011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.325069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.325095 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.427187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.427267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.427280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.427295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.427306 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.529802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.529830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.529841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.529855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.529866 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.631341 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.631376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.631388 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.631404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.631416 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.734096 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.734130 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.734141 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.734158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.734169 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.835767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.835798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.835807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.835819 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.835827 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.938708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.938796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.938820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.938849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:39 crc kubenswrapper[4885]: I1205 20:06:39.938871 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:39Z","lastTransitionTime":"2025-12-05T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.042217 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.042278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.042294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.042318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.042335 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.145159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.145209 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.145226 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.145247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.145262 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.254639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.254685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.254701 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.254723 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.254739 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.358043 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.358099 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.358116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.358138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.358155 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.461999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.462085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.462103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.462126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.462151 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.563997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.564090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.564100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.564115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.564126 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.667365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.667423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.667439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.667462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.667480 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.770692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.770751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.770766 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.770787 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.770801 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.873214 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.873324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.873337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.873354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.873367 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.976180 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.976235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.976247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.976263 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:40 crc kubenswrapper[4885]: I1205 20:06:40.976275 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:40Z","lastTransitionTime":"2025-12-05T20:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.078660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.078693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.078706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.078724 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.078734 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.171922 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.171970 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.172070 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:41 crc kubenswrapper[4885]: E1205 20:06:41.172066 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:41 crc kubenswrapper[4885]: E1205 20:06:41.172182 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.171932 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:41 crc kubenswrapper[4885]: E1205 20:06:41.172249 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:41 crc kubenswrapper[4885]: E1205 20:06:41.172300 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.181540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.181605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.181622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.181648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.181665 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.283664 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.283707 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.283719 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.283734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.283747 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.385962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.385999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.386010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.386043 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.386055 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.488774 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.489243 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.489394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.489592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.489765 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.592278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.592316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.592327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.592343 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.592355 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.695701 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.695763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.695775 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.695791 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.695802 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.798166 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.798434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.798494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.798552 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.798613 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.901460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.901799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.902331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.902689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:41 crc kubenswrapper[4885]: I1205 20:06:41.902877 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:41Z","lastTransitionTime":"2025-12-05T20:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.006213 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.006294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.006318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.006349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.006372 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.108615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.108690 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.108735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.108770 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.108795 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.173556 4885 scope.go:117] "RemoveContainer" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" Dec 05 20:06:42 crc kubenswrapper[4885]: E1205 20:06:42.173825 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.211922 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.211984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.212006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.212062 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.212080 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.315232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.315287 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.315306 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.315329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.315350 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.418124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.418165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.418178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.418196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.418208 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.520641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.520687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.520703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.520723 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.520738 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.623447 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.623513 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.623526 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.623552 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.623567 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.726086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.726131 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.726143 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.726159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.726169 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.828666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.828722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.828734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.828748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.828757 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.931442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.931501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.931530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.931555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:42 crc kubenswrapper[4885]: I1205 20:06:42.931572 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:42Z","lastTransitionTime":"2025-12-05T20:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.033693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.033721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.033730 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.033743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.033751 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.137110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.137189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.137210 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.137234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.137254 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.171980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.172091 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.172115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.172113 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.172251 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.172319 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.172406 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.172548 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.240375 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.240421 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.240432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.240448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.240462 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.344270 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.344330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.344348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.344372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.344394 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.447235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.447337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.447359 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.447389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.447413 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.549752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.549817 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.549831 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.549852 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.549864 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.652269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.652389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.652417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.652450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.652476 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.683617 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.683732 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:43 crc kubenswrapper[4885]: E1205 20:06:43.683784 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:07:15.683768333 +0000 UTC m=+100.980583994 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.755047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.755095 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.755104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.755118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.755129 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.858443 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.858485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.858494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.858507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.858517 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
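The 32s durationBeforeRetry above is consistent with the exponential back-off kubelet applies to failed volume operations: assuming a 500 ms initial delay doubling per failure with a cap of roughly two minutes (these constants are an assumption, not confirmed by this log), 32s is the seventh attempt (500 ms x 2^6). A sketch under those assumptions:

```go
// Sketch: exponential volume-operation retry schedule, assuming a 500ms
// initial delay, factor 2, and a ~2m cap. Attempt 7 prints 32s, matching
// the durationBeforeRetry in the log.
package main

import (
	"fmt"
	"time"
)

func main() {
	wait := 500 * time.Millisecond
	const maxWait = 2*time.Minute + 2*time.Second // assumed cap
	for attempt := 1; wait <= maxWait; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %s\n", attempt, wait)
		wait *= 2
	}
}
```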
Has your network provider started?"} Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.960411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.960473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.960492 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.960515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:43 crc kubenswrapper[4885]: I1205 20:06:43.960532 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:43Z","lastTransitionTime":"2025-12-05T20:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.062576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.062630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.062641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.062659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.062670 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.164807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.164859 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.164874 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.164892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.164904 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.268195 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.268230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.268241 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.268256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.268266 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.370525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.370565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.370576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.370592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.370603 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.473378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.473433 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.473451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.473471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.473487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.575558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.575589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.575598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.575609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.575620 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.633739 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/0.log" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.633788 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6c25e90-efcc-490c-afef-970c3a62c809" containerID="245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d" exitCode=1 Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.633832 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerDied","Data":"245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.634333 4885 scope.go:117] "RemoveContainer" containerID="245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.648900 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.667194 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z"
[... the NodeNotReady status cycle repeats at 20:06:44.677 ...]
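The status patches above fail because the network-node-identity webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24, while the node clock reads 2025-12-05. The check that rejects it is ordinary x509 validity verification; a sketch of that NotBefore/NotAfter test (the file path is illustrative, not taken from this system):

```go
// Sketch: parse a PEM certificate and apply the same validity-window
// check that fails in the webhook calls above.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("webhook-cert.pem") // illustrative path
	if err != nil {
		fmt.Println("read:", err)
		return
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Println("no PEM block found")
		return
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println("parse:", err)
		return
	}
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("certificate expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Println("certificate not yet valid")
	default:
		fmt.Println("certificate valid until", cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```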
Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.679933 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.698614 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.709619 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.723328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.735286 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.749675 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.759913 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779874 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779914 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779929 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.779940 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.792899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.804228 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.813891 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.826692 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.841532 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.853047 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.863292 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:44Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.881906 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.881939 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.881946 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.881960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.881969 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.984546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.984593 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.984602 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.984617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:44 crc kubenswrapper[4885]: I1205 20:06:44.984626 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:44Z","lastTransitionTime":"2025-12-05T20:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.087308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.087340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.087351 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.087368 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.087379 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.172386 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.172429 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:45 crc kubenswrapper[4885]: E1205 20:06:45.172531 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.172757 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.172796 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:45 crc kubenswrapper[4885]: E1205 20:06:45.172893 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:45 crc kubenswrapper[4885]: E1205 20:06:45.173361 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:45 crc kubenswrapper[4885]: E1205 20:06:45.173488 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.189436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.189465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.189474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.189485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.189496 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.191482 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.205533 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.223213 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.240079 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.252659 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.272469 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.287373 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.291304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.291352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.291363 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.291381 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.291392 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.296980 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.307239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.330745 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef4
5f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.342503 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.356204 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.365191 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.377630 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.386981 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.393144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.393182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.393192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.393206 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.393215 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.397733 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.409202 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.495905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.495968 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.495980 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.495998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.496010 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.598741 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.598780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.598789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.598802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.598811 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.638646 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/0.log" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.638706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerStarted","Data":"23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.651620 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.666598 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\
\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.676560 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.690144 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.702001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.702040 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.702048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.702061 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.702069 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.707315 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.722593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.736987 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.750395 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.765257 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.778431 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.796314 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.804874 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.804904 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.804913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.805064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.805078 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.814464 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.848534 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.861579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.887412 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.900136 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.907556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.907593 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.907608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.907627 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.907641 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:45Z","lastTransitionTime":"2025-12-05T20:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:45 crc kubenswrapper[4885]: I1205 20:06:45.913549 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.009730 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.009760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.009769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.009781 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.009789 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.112422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.112467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.112480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.112498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.112510 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.214916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.214983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.215005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.215078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.215106 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.317964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.318064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.318081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.318100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.318117 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.420580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.420618 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.420626 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.420639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.420649 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.522877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.522934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.522948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.522964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.522977 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.625746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.625784 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.625796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.625814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.625826 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.727549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.727589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.727599 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.727613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.727622 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.831201 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.831272 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.831291 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.831321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.831343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.935428 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.935455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.935464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.935475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:46 crc kubenswrapper[4885]: I1205 20:06:46.935483 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:46Z","lastTransitionTime":"2025-12-05T20:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.038121 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.038152 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.038160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.038172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.038180 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.140226 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.140516 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.140648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.140802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.140929 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.171940 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.171944 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.171955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.172543 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.172453 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.172038 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.172562 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.173041 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.243386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.243423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.243433 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.243448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.243459 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.346334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.346390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.346399 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.346412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.346423 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.448928 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.448986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.449003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.449063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.449081 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.552396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.552457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.552478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.552512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.552550 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.654709 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.654764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.654778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.654799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.654813 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.669759 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.669809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.669820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.669840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.669858 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.683409 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.688114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.688172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.688198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.688230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.688251 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.709143 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.714882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.714981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.715006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.715069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.715083 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.734603 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.738889 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.738930 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.738940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.738955 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.738968 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.758087 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.762714 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.762751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.762762 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.762780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.762795 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.781388 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:47 crc kubenswrapper[4885]: E1205 20:06:47.781613 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.783214 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.783302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.783329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.783357 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.783465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.887119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.887186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.887202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.887225 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.887246 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.989420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.989473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.989484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.989500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:47 crc kubenswrapper[4885]: I1205 20:06:47.989512 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:47Z","lastTransitionTime":"2025-12-05T20:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.091207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.091279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.091304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.091334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.091355 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.194566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.194627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.194684 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.194713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.194736 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.298464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.298560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.298582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.298605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.298624 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.401413 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.401458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.401467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.401481 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.401493 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.504390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.504443 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.504455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.504476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.504490 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.606584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.606650 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.606670 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.606695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.606714 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.709552 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.709883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.710046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.710151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.710232 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.813959 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.813998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.814016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.814046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.814054 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.918497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.918912 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.919102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.919445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:48 crc kubenswrapper[4885]: I1205 20:06:48.919591 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:48Z","lastTransitionTime":"2025-12-05T20:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.023479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.023546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.023571 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.023600 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.023621 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.126241 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.126310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.126332 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.126360 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.126382 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.172487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:49 crc kubenswrapper[4885]: E1205 20:06:49.172644 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.172920 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:49 crc kubenswrapper[4885]: E1205 20:06:49.173007 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.173220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:49 crc kubenswrapper[4885]: E1205 20:06:49.173301 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.173622 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:49 crc kubenswrapper[4885]: E1205 20:06:49.173720 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.229666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.229998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.230204 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.230489 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.230705 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.334504 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.334539 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.334550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.334566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.334577 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.436356 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.436420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.436438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.436461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.436479 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.538827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.538876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.538886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.538902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.538914 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.641734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.642647 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.642810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.642962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.643148 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.747060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.747091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.747108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.747121 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.747129 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.850402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.850461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.850478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.850501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.850518 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.952610 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.952657 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.952668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.952687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:49 crc kubenswrapper[4885]: I1205 20:06:49.952699 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:49Z","lastTransitionTime":"2025-12-05T20:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.055444 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.055494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.055510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.055532 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.055548 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.158892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.158956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.158990 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.159060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.159091 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.261235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.261278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.261294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.261329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.261346 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.363590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.363627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.363638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.363654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.363667 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.466328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.466378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.466397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.466422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.466441 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.570133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.570204 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.570225 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.570252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.570273 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.672879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.672934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.672954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.672980 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.673002 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.776557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.776616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.776640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.776666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.776686 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.880318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.880395 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.880418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.880446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.880467 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.983374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.983445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.983466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.983528 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:50 crc kubenswrapper[4885]: I1205 20:06:50.983551 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:50Z","lastTransitionTime":"2025-12-05T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.086721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.086787 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.086810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.086836 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.086857 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.171911 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.172008 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.172055 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.171928 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:51 crc kubenswrapper[4885]: E1205 20:06:51.172204 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:51 crc kubenswrapper[4885]: E1205 20:06:51.172362 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:51 crc kubenswrapper[4885]: E1205 20:06:51.172470 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:51 crc kubenswrapper[4885]: E1205 20:06:51.172661 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.189203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.189268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.189304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.189333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.189354 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.292628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.292688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.292706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.292735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.292754 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.396752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.397160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.397327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.397532 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.397695 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.501147 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.501603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.501976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.502253 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.502403 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.606823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.606884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.606902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.606930 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.606950 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.710248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.710325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.710350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.710380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.710401 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.814547 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.814610 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.814628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.814653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.814671 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.918543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.918996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.919212 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.919415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:51 crc kubenswrapper[4885]: I1205 20:06:51.919579 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:51Z","lastTransitionTime":"2025-12-05T20:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.022689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.022772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.022796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.022828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.022852 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.125571 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.125635 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.125653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.125678 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.125698 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.229694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.229753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.229771 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.229798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.229829 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.334450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.334523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.334546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.334575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.334597 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.437586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.437645 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.437668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.437696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.437716 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.540502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.540540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.540551 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.540566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.540575 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.642746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.642790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.642803 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.642820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.642831 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.745482 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.745544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.745567 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.745593 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.745610 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.848818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.848880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.848898 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.848923 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.848942 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.952140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.952215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.952239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.952271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:52 crc kubenswrapper[4885]: I1205 20:06:52.952297 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:52Z","lastTransitionTime":"2025-12-05T20:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.055075 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.055464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.055619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.055732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.055885 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.159157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.159254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.159272 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.159297 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.159313 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.172121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.172225 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.172300 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:53 crc kubenswrapper[4885]: E1205 20:06:53.172296 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.172119 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:53 crc kubenswrapper[4885]: E1205 20:06:53.172414 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:53 crc kubenswrapper[4885]: E1205 20:06:53.172530 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:53 crc kubenswrapper[4885]: E1205 20:06:53.172686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.173722 4885 scope.go:117] "RemoveContainer" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.262232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.262414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.262591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.262747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.262888 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.366171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.366233 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.366251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.366274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.366293 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.469778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.469834 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.469850 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.469873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.469893 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.572394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.572429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.572445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.572460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.572471 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.666541 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/2.log" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.668249 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.670083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.677998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.678073 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.678090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.678111 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.678128 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.682133 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.694426 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.709380 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.733935 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.746628 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.761591 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.774536 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.780358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.780398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.780416 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.780432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.780443 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.793629 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.805638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.820667 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.831724 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.842106 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.852740 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.863599 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.876492 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.882068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.882101 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.882113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.882127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.882138 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.887806 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.904334 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.983908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.983975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.983988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.984004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:53 crc kubenswrapper[4885]: I1205 20:06:53.984029 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:53Z","lastTransitionTime":"2025-12-05T20:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.086895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.086955 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.086966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.086984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.086998 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.189405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.189442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.189453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.189467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.189478 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.292108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.292149 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.292198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.292218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.292230 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.394155 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.394193 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.394202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.394216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.394225 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.497385 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.497436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.497448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.497465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.497477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.599805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.599875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.599893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.599919 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.599939 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.674416 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/3.log" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.675546 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/2.log" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.679228 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" exitCode=1 Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.679295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.679370 4885 scope.go:117] "RemoveContainer" containerID="4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.680277 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:06:54 crc kubenswrapper[4885]: E1205 20:06:54.680551 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.703764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.704186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.704200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.704216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.704228 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.704302 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.716231 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.726965 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.737974 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.748342 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.790523 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809263 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.809616 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.823570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.840965 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.858143 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.876354 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.892883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.908898 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.911605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.911828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.912102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.912319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.912500 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:54Z","lastTransitionTime":"2025-12-05T20:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.938340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:54Z\\\",\\\"message\\\":\\\"ster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:06:54.173011 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20:06:54.172953 6875 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"d937b3b3-82c3-4791-9a66-41b9fed53e9d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Router\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.953421 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8
dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.965758 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:54 crc kubenswrapper[4885]: I1205 20:06:54.976447 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.013995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.014094 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.014114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.014133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.014146 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.116792 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.116827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.116836 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.116869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.116880 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.172160 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.172208 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.172212 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:55 crc kubenswrapper[4885]: E1205 20:06:55.172378 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:55 crc kubenswrapper[4885]: E1205 20:06:55.172480 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.172495 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:55 crc kubenswrapper[4885]: E1205 20:06:55.172638 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:55 crc kubenswrapper[4885]: E1205 20:06:55.172739 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.194602 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.214087 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.219345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.219403 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.219420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.219443 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.219461 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.245998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1
f67350b7c8f7787b8de44a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4264467503bca6b1d26f47b9817e8efe327fdef45f580cbf9f30bf8f5e181e32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:26Z\\\",\\\"message\\\":\\\"Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 20:06:25.996152 6523 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:06:25.995830 6523 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1205 20:06:25.996191 6523 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cni\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:54Z\\\",\\\"message\\\":\\\"ster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:06:54.173011 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20:06:54.172953 6875 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"d937b3b3-82c3-4791-9a66-41b9fed53e9d\\\\\\\", 
Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Router\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.267451 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.289203 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.307550 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.322361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.322417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.322432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.322453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.322467 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.333278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.347359 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.366383 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.387209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.403340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.415258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 
20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.424456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.424496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.424507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.424530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.424543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.428443 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 
20:06:55.442964 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.458833 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.471399 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.485698 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.528252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.528294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.528304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.528317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.528327 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.630437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.630479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.630490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.630508 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.630522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.686667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/3.log" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.693149 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:06:55 crc kubenswrapper[4885]: E1205 20:06:55.693420 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.708461 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.727876 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.734948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.734995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.735013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.735063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.735080 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.753073 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.772621 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.791881 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.804100 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.828698 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:54Z\\\",\\\"message\\\":\\\"ster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:06:54.173011 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20:06:54.172953 6875 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"d937b3b3-82c3-4791-9a66-41b9fed53e9d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Router\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.838432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.838471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.838481 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.838495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.838506 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.844582 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.859215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.872751 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.895643 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.909483 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.923432 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.937376 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.940999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.941056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.941068 4885 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.941083 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.941094 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:55Z","lastTransitionTime":"2025-12-05T20:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.949939 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.964667 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:55 crc kubenswrapper[4885]: I1205 20:06:55.980040 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.043478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.043525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.043538 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.043558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.043574 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.146435 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.146474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.146482 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.146497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.146508 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.249105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.249144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.249156 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.249171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.249181 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.351995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.352082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.352100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.352122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.352139 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.455150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.455212 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.455239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.455267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.455287 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.558969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.559181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.559213 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.559281 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.559304 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.662387 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.662432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.662449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.662501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.662520 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.765905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.765969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.765990 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.766014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.766072 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.869205 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.869279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.869305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.869337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.869358 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.971985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.972058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.972071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.972089 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:56 crc kubenswrapper[4885]: I1205 20:06:56.972101 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:56Z","lastTransitionTime":"2025-12-05T20:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.075533 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.075612 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.075638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.075667 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.075691 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.172375 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.172439 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.172487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.172538 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.172541 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.172686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.172776 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.172833 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.178799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.178865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.178887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.178910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.178927 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.230997 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.231192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231237 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:01.231194856 +0000 UTC m=+146.528010577 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.231308 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231387 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231480 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:08:01.231449924 +0000 UTC m=+146.528265655 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.231383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231549 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231556 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231579 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231616 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.231619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231654 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:08:01.23162661 +0000 UTC m=+146.528442321 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231689 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:08:01.231668892 +0000 UTC m=+146.528484663 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231815 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231863 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231885 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.231978 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:08:01.231954181 +0000 UTC m=+146.528769892 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.282885 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.282982 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.283004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.283086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.283109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.386082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.386132 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.386151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.386177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.386195 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.490420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.490496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.490518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.490544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.490561 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.593409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.593479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.593503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.593540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.593618 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.697763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.697818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.697840 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.697870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.697891 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.801137 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.801202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.801218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.801241 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.801260 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.904157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.904210 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.904224 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.904240 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.904253 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.920546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.920627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.920660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.920691 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.920714 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.934776 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.939874 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.939906 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.939914 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.939927 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.939939 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.952922 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.956376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.956434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.956451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.956473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.956489 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:57 crc kubenswrapper[4885]: E1205 20:06:57.977692 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.982269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.982326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.982344 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.982373 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:57 crc kubenswrapper[4885]: I1205 20:06:57.982391 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:57Z","lastTransitionTime":"2025-12-05T20:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: E1205 20:06:57.999904 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.003468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.003509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.003517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.003530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.003540 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: E1205 20:06:58.018170 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:06:58 crc kubenswrapper[4885]: E1205 20:06:58.018313 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.019966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.020007 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.020048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.020068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.020080 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.122422 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.122476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.122486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.122502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.122513 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.225635 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.225683 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.225702 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.225757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.225774 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.328179 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.328216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.328225 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.328241 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.328252 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.431230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.431274 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.431285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.431301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.431312 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.533871 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.533929 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.533940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.533960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.533971 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.636884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.636958 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.636973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.636999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.637014 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.740216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.740283 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.740299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.740330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.740351 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.843353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.843423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.843448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.843476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.843497 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.945977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.946051 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.946069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.946091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:58 crc kubenswrapper[4885]: I1205 20:06:58.946107 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:58Z","lastTransitionTime":"2025-12-05T20:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.048527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.048570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.048580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.048650 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.048663 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.151326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.151367 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.151381 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.151398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.151408 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.171971 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.172136 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.172136 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:06:59 crc kubenswrapper[4885]: E1205 20:06:59.172322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.172407 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:06:59 crc kubenswrapper[4885]: E1205 20:06:59.172583 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:06:59 crc kubenswrapper[4885]: E1205 20:06:59.172914 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:06:59 crc kubenswrapper[4885]: E1205 20:06:59.172982 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.254600 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.254659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.254680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.254708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.254732 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.357820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.357888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.357911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.357940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.357960 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.460495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.460556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.460572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.460594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.460612 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.562530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.562592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.562608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.562631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.562648 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.665454 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.665536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.665560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.665627 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.665647 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.769323 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.769383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.769400 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.769425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.769443 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.872504 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.872594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.872622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.872651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.872670 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.976086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.976164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.976188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.976310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:06:59 crc kubenswrapper[4885]: I1205 20:06:59.976338 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:06:59Z","lastTransitionTime":"2025-12-05T20:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.079717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.079797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.079817 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.079854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.079891 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.182917 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.182984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.183002 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.183065 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.183085 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.285963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.286049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.286067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.286092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.286109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.388858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.388929 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.388973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.389006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.389067 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.491532 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.491592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.491608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.491634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.491652 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.594431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.594479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.594496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.594519 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.594537 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.697911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.697968 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.697987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.698011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.698052 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.801601 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.801655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.801673 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.801696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.801713 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.905127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.905199 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.905219 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.905249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:00 crc kubenswrapper[4885]: I1205 20:07:00.905272 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:00Z","lastTransitionTime":"2025-12-05T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.008412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.008476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.008499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.008531 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.008553 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:01Z","lastTransitionTime":"2025-12-05T20:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.111666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.111721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.111739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.111764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.111781 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:01Z","lastTransitionTime":"2025-12-05T20:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.171920 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.172112 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.172175 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.172226 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:01 crc kubenswrapper[4885]: E1205 20:07:01.172300 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:01 crc kubenswrapper[4885]: E1205 20:07:01.172424 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:01 crc kubenswrapper[4885]: E1205 20:07:01.172637 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:01 crc kubenswrapper[4885]: E1205 20:07:01.172799 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.214806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.214871 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.214893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.214919 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.214940 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:01Z","lastTransitionTime":"2025-12-05T20:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.318570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.318639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.318665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.318695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:01 crc kubenswrapper[4885]: I1205 20:07:01.318716 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:01Z","lastTransitionTime":"2025-12-05T20:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the five-message block above repeats with only its timestamps advancing, roughly every 100 ms, from 20:07:01.420 through 20:07:03.076]
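Every one of the repeating NotReady conditions above has the same root cause: the CRI runtime reports NetworkReady=false because nothing in /etc/kubernetes/cni/net.d/ looks like a CNI network definition yet. A minimal sketch of that directory test, assuming the usual libcni convention that a config is any *.conf, *.conflist, or *.json file (the exact kubelet/CRI-O matching code is not quoted here):

```go
// cnicheck.go: report whether a CNI network config exists in a directory,
// mirroring the "no CNI configuration file in /etc/kubernetes/cni/net.d/"
// readiness test in the log above. The accepted extensions follow the
// common libcni convention and are an assumption, not kubelet source.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	if len(os.Args) > 1 {
		dir = os.Args[1] // allow checking another conf dir, e.g. /etc/cni/net.d
	}
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("NetworkReady=false: cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	for _, e := range entries {
		ext := strings.ToLower(filepath.Ext(e.Name()))
		if !e.IsDir() && (ext == ".conf" || ext == ".conflist" || ext == ".json") {
			fmt.Printf("NetworkReady=true: found %s\n", filepath.Join(dir, e.Name()))
			return
		}
	}
	fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", dir)
	os.Exit(1)
}
```

On this node the check would flip to true once ovn-kubernetes writes its config, which is also what the multus messages further down are waiting on.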
Dec 05 20:07:03 crc kubenswrapper[4885]: I1205 20:07:03.171953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:03 crc kubenswrapper[4885]: I1205 20:07:03.171970 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:03 crc kubenswrapper[4885]: I1205 20:07:03.172129 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:03 crc kubenswrapper[4885]: E1205 20:07:03.172443 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:03 crc kubenswrapper[4885]: I1205 20:07:03.172580 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:03 crc kubenswrapper[4885]: E1205 20:07:03.172615 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:07:03 crc kubenswrapper[4885]: E1205 20:07:03.172865 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:03 crc kubenswrapper[4885]: E1205 20:07:03.173282 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[the node-status block (NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady / "Node became not ready", all with the same KubeletNotReady condition) keeps repeating roughly every 100 ms, timestamps advancing from 20:07:03.179 through 20:07:05.139]
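The util.go/pod_workers.go pairs above are the kubelet declining to create sandboxes while the network is down, then requeueing each pod. For a capture this repetitive, a quick tally of sync failures per pod summarizes the blast radius; a small sketch that reads journal text on stdin (the "Error syncing pod, skipping" marker and the pod="ns/name" field are taken from the lines above; everything else is illustrative):

```go
// podsyncerrs.go: tally "Error syncing pod" journal lines by pod.
// Reads journal text on stdin, e.g.: journalctl -u kubelet | ./podsyncerrs
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// klog structured output: ... "Error syncing pod, skipping" ... pod="ns/name" ...
	re := regexp.MustCompile(`"Error syncing pod, skipping".*?pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 4*1024*1024) // status-patch lines can be huge
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%6d  %s\n", n, pod)
	}
}
```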
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.172439 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:05 crc kubenswrapper[4885]: E1205 20:07:05.172772 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.172970 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.173089 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:05 crc kubenswrapper[4885]: E1205 20:07:05.173101 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.173200 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:05 crc kubenswrapper[4885]: E1205 20:07:05.173311 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:05 crc kubenswrapper[4885]: E1205 20:07:05.173505 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.195510 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b86010ce1bda5bfa4df674b9f8502d88467d2b22453c50bd4e46d4078a196ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c589d2fd85474d9638cff04a27e918020c0af7b7d91d97e5b9812bb862eb2be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b820cab196c3ad014ebd069c456efc9e952161ec8e138708804d83875c2fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.239894 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmtwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6c25e90-efcc-490c-afef-970c3a62c809\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:43Z\\\",\\\"message\\\":\\\"2025-12-05T20:05:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a\\\\n2025-12-05T20:05:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0a592032-b29a-4479-b0f7-86dba8fdbc9a to /host/opt/cni/bin/\\\\n2025-12-05T20:05:58Z [verbose] multus-daemon started\\\\n2025-12-05T20:05:58Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:06:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qd7qn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmtwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.242116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.242188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.242206 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.242986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.243099 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container 
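The terminated kube-multus container message embedded above is the other half of the same stall: multus polls for the readiness indicator file 10-ovn-kubernetes.conf until its PollImmediate call gives up ("timed out waiting for the condition"). A plain-stdlib stand-in for that wait loop; the path comes from the log, while the one-second interval and 45-second timeout are assumptions for illustration, not multus's actual configuration:

```go
// waitfile.go: poll until a readiness-indicator file exists or a timeout
// elapses, the pattern behind multus's "still waiting for
// readinessindicatorfile" / "timed out waiting for the condition" lines.
package main

import (
	"fmt"
	"os"
	"time"
)

func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // the file exists: the default network is ready
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(interval)
	}
}

func main() {
	path := "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"
	if len(os.Args) > 1 {
		path = os.Args[1]
	}
	if err := waitForFile(path, time.Second, 45*time.Second); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("readiness indicator present:", path)
}
```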
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.259225 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8bd00a1-3879-4791-8e78-150f2a0bf522\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d7f8e0dde54c548075228face11f27ad4cce9c31f29e607e8056a2ee0895c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5125903b9bf26f704c07e7bd1704545a328d28ad14984a5d3183c0b44538fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr9g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hhxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z"
Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.282837 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e003c8d-46a7-4194-b63b-100b1d5af08e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea
3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:05:47.615627 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:05:47.620127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2308640064/tls.crt::/tmp/serving-cert-2308640064/tls.key\\\\\\\"\\\\nI1205 20:05:53.173724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:05:53.177825 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:05:53.177916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:05:53.177988 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:05:53.178048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:05:53.184147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:05:53.184247 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:05:53.184304 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:05:53.184330 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:05:53.184354 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:05:53.184379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:05:53.184570 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:05:53.192549 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.303596 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.317924 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-msl9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c5536c-62ec-4ca4-938c-1e0322c676b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ba49a192d2aa1f506d1f9d631284fb4cf1f29f29fd58ded75127c9e91e4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbwlb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-msl9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.332111 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c0a952-e24a-49c2-b4ba-e20be61b840d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4pf8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2jdj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.345290 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0b551c2-e21f-4c68-93e8-b3865710c748\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55d112bc62087d911c13b8a28f8d3d57d83b8a3946f4d5003592be953f5bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2407c87ee202205691e8650387a082757f38bbfc3271575f6936d1b25f81ecda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5203d8faff0bf21cb02982db400e7803cbbd1caa8febda97f8b0c4cea1dcc48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7d9f0dc0b49d774c2a451141f31fccc995ec95cc12dd72ea4e86b6a769cf709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.346672 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.346709 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.346724 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.346744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.346757 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.358857 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.370328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d0e156e3807c6b0b556dd2f12eb2f3545eb9c393924e6476b146929ecfb118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.393105 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86ae690a-3705-45ae-8816-da5f33d2105e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1
f67350b7c8f7787b8de44a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:06:54Z\\\",\\\"message\\\":\\\"ster) 0 (per node) and 0 (template) load balancers\\\\nF1205 20:06:54.173011 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:06:54Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20:06:54.172953 6875 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"d937b3b3-82c3-4791-9a66-41b9fed53e9d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Router\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dcsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wx7m6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.407864 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fb5064d-704a-4c90-a3bd-154b18819ea8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d280fa78e9836e2ff7c581c66918e30e221878eb3c3313747372956dfdb7cfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773309b4e65e9aeda4481e7caaa1e70a182e555276800e69a0efd182e0e6c617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609dee7c742a09043ea37e80a046ed47448f63c8503e069a8f29d2cb3adc0735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.424295 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.437340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2b6b12223ad91b1352cb3f0b872e307c321dca3e31eda59fd6d1be760a1f14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkflz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5m8lc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.452075 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.452144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.452167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.452199 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.452224 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.457591 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15623dc-71c3-4ee6-9078-3980cada3660\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12e8675a5e59848a34d01098d3c614976936edd78e3e25f56ffac1d52973dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a11e93b73f24c21bfbd4bfb4370cb2088a3d9e55dcf0dd9ab7b14bb6c7d4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847a6329f693e33072baeee2ae3e088ca643f32d605b6434f74f4b7e043b64f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:05:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abdb9c03c908c45172d8819ef6d6f0008880e4e7ca52a9c3b58a97ca3634e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f139bad176339777aa0795acbecabf3c338cf39647a59735d9052107d5e38217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d06b93b05b7798978cfedc16d369bbf004060af46deb42617b49a131aa4b7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c13463ae57fcf8b0302932077b3e3eba090b29c61b2cd49cd034372ae7f34fe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:06:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75bfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:05:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5qh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.469416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grvrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f80533-e916-4ba6-9c5d-86f0dbd6f521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a9162629578c9b9f38c959c25b9d91b7ea35a637b1ad7b4957cf72d29f75ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zgr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:06:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grvrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:07:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.555708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.555776 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.555800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.555830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.555856 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.658139 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.658205 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.658228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.658259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.658284 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.760769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.760816 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.760827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.760843 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.760855 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.864113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.864175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.864192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.864215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.864231 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.967275 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.967717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.967876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.968105 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:05 crc kubenswrapper[4885]: I1205 20:07:05.968281 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:05Z","lastTransitionTime":"2025-12-05T20:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.071150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.071766 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.071866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.072004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.072196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.172943 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:07:06 crc kubenswrapper[4885]: E1205 20:07:06.173206 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.175861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.175942 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.175967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.175994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.176075 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.186822 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.279525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.279582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.279598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.279620 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.279636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.382490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.382568 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.382591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.382624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.382651 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.485929 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.485999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.486040 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.486066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.486083 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.589486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.589561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.589585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.589614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.589636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.692616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.692668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.692684 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.692707 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.692725 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.795802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.795876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.795894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.795919 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.795936 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.899428 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.899509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.899565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.899596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:06 crc kubenswrapper[4885]: I1205 20:07:06.899620 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:06Z","lastTransitionTime":"2025-12-05T20:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.003233 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.003329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.003351 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.003374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.003391 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.107189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.107253 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.107270 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.107299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.107315 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.172701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.172824 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.172822 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:07 crc kubenswrapper[4885]: E1205 20:07:07.172998 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.173066 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:07 crc kubenswrapper[4885]: E1205 20:07:07.173293 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:07 crc kubenswrapper[4885]: E1205 20:07:07.173457 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:07 crc kubenswrapper[4885]: E1205 20:07:07.173601 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.209815 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.209889 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.209917 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.209940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.209957 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.313310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.313371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.313394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.313423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.313443 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.416415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.416471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.416490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.416512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.416529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.521247 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.521295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.521313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.521339 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.521356 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.624618 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.624695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.624715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.624743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.624762 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.728590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.728700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.728726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.728816 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.728842 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.831733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.831782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.831795 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.831810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.831822 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.935780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.935842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.935858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.935879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:07 crc kubenswrapper[4885]: I1205 20:07:07.935891 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:07Z","lastTransitionTime":"2025-12-05T20:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.039088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.039134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.039148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.039167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.039180 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.122616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.122660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.122671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.122686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.122698 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.144481 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.149270 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.149362 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.149379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.149396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.149409 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.168455 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.172989 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.173035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.173046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.173059 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.173069 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.190842 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.196229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.196336 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.196355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.196408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.196430 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.220070 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.225277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.225487 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.225619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.225778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.225927 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.244278 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"947d01a1-b35b-4747-9479-3c70c2147f66\\\",\\\"systemUUID\\\":\\\"5edae59e-e3c2-4636-b1d3-4225cdddd2db\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:07:08Z is after 2025-08-24T17:21:41Z" Dec 05 20:07:08 crc kubenswrapper[4885]: E1205 20:07:08.244428 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.246540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.246579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.246589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.246607 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.246621 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.350746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.350814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.350831 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.350853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.350870 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.453762 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.453815 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.453832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.453856 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.453875 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.557149 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.557265 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.557288 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.557317 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.557339 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.660654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.660703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.660722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.660744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.660760 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.763091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.763136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.763152 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.763173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.763191 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.867496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.867575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.867634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.867666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.867687 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.971901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.972091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.972122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.972190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:08 crc kubenswrapper[4885]: I1205 20:07:08.972212 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:08Z","lastTransitionTime":"2025-12-05T20:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.075663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.076135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.076150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.076173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.076190 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.172722 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.172772 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:09 crc kubenswrapper[4885]: E1205 20:07:09.172903 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.172930 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:09 crc kubenswrapper[4885]: E1205 20:07:09.173153 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.173260 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:09 crc kubenswrapper[4885]: E1205 20:07:09.173369 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:09 crc kubenswrapper[4885]: E1205 20:07:09.173850 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.180322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.180383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.180404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.180431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.180453 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.195440 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.284098 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.284164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.284188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.284218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.284240 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.387150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.387187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.387196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.387210 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.387220 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.489651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.489685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.489700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.489717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.489729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.592198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.592249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.592257 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.592271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.592279 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.695120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.695176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.695194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.695218 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.695235 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.797686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.797747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.797764 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.797796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.797818 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.900603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.900666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.900683 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.900711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:09 crc kubenswrapper[4885]: I1205 20:07:09.900728 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:09Z","lastTransitionTime":"2025-12-05T20:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.003672 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.003706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.003715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.003732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.003741 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.106078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.106153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.106177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.106207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.106231 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.208865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.208927 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.208943 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.208970 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.208984 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.311660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.311737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.311758 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.311786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.311808 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.415165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.415240 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.415267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.415292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.415310 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.518391 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.518447 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.518457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.518477 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.518498 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.625702 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.625960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.625977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.626010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.626093 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.728490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.728580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.728594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.728613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.728801 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.832185 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.832335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.832371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.832445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.832465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.935346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.935419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.935441 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.935468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:10 crc kubenswrapper[4885]: I1205 20:07:10.935488 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:10Z","lastTransitionTime":"2025-12-05T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.038890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.039255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.039470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.039748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.039988 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.144009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.144095 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.144113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.144138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.144157 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.172400 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.172406 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.172473 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.172546 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:11 crc kubenswrapper[4885]: E1205 20:07:11.172778 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:11 crc kubenswrapper[4885]: E1205 20:07:11.173187 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:07:11 crc kubenswrapper[4885]: E1205 20:07:11.173299 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:11 crc kubenswrapper[4885]: E1205 20:07:11.173458 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.246938 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.247074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.247091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.247110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.247122 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.350231 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.350295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.350312 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.350338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.350361 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.453628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.453696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.453722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.453754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.453776 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.557560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.557637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.557662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.557692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.557712 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.661916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.662074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.662099 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.662123 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.662141 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.765014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.765101 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.765119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.765142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.765161 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.868268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.868390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.868416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.868439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.868458 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.971185 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.971248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.971267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.971290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:11 crc kubenswrapper[4885]: I1205 20:07:11.971307 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:11Z","lastTransitionTime":"2025-12-05T20:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.074789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.074865 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.074883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.074910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.074929 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.178573 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.178658 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.178680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.178706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.178725 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.280855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.280925 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.280946 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.280969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.280986 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.384324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.384385 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.384407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.384438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.384490 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.487372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.487439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.487461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.487486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.487503 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.589678 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.589731 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.589748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.589771 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.589787 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.692162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.692238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.692265 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.692296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.692319 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.794984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.795126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.795153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.795181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.795199 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.897995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.898073 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.898090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.898117 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:12 crc kubenswrapper[4885]: I1205 20:07:12.898156 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:12Z","lastTransitionTime":"2025-12-05T20:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.000219 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.000280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.000290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.000304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.000314 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:13Z","lastTransitionTime":"2025-12-05T20:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.102732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.102791 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.102818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.102844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.102864 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:13Z","lastTransitionTime":"2025-12-05T20:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.172623 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.172700 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:13 crc kubenswrapper[4885]: E1205 20:07:13.172813 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.172911 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:13 crc kubenswrapper[4885]: E1205 20:07:13.173047 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:07:13 crc kubenswrapper[4885]: E1205 20:07:13.173290 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.173323 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:13 crc kubenswrapper[4885]: E1205 20:07:13.173468 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.205497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.205733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.205802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.205882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:13 crc kubenswrapper[4885]: I1205 20:07:13.205959 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:13Z","lastTransitionTime":"2025-12-05T20:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.172585 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.172645 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.172685 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.172630 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.172924 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.173151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.173255 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.173321 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.225498 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.225465558 podStartE2EDuration="1m17.225465558s" podCreationTimestamp="2025-12-05 20:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.193507639 +0000 UTC m=+100.490323310" watchObservedRunningTime="2025-12-05 20:07:15.225465558 +0000 UTC m=+100.522281249"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.225835 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.22582315 podStartE2EDuration="6.22582315s" podCreationTimestamp="2025-12-05 20:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.223349956 +0000 UTC m=+100.520165637" watchObservedRunningTime="2025-12-05 20:07:15.22582315 +0000 UTC m=+100.522638841"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.265468 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podStartSLOduration=78.265448554 podStartE2EDuration="1m18.265448554s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.251505958 +0000 UTC m=+100.548321619" watchObservedRunningTime="2025-12-05 20:07:15.265448554 +0000 UTC m=+100.562264215"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.279763 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c5qh5" podStartSLOduration=78.279741221 podStartE2EDuration="1m18.279741221s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.26742954 +0000 UTC m=+100.564245281" watchObservedRunningTime="2025-12-05 20:07:15.279741221 +0000 UTC m=+100.576556882"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.327794 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-grvrw" podStartSLOduration=78.327773457 podStartE2EDuration="1m18.327773457s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.283503298 +0000 UTC m=+100.580318959" watchObservedRunningTime="2025-12-05 20:07:15.327773457 +0000 UTC m=+100.624589118"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.386186 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zmtwj" podStartSLOduration=78.386160389 podStartE2EDuration="1m18.386160389s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.383748958 +0000 UTC m=+100.680564629" watchObservedRunningTime="2025-12-05 20:07:15.386160389 +0000 UTC m=+100.682976070"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.401816 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hhxl" podStartSLOduration=78.401780031 podStartE2EDuration="1m18.401780031s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.40174333 +0000 UTC m=+100.698558991" watchObservedRunningTime="2025-12-05 20:07:15.401780031 +0000 UTC m=+100.698595692"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.420496 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.420474506 podStartE2EDuration="1m22.420474506s" podCreationTimestamp="2025-12-05 20:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.420102534 +0000 UTC m=+100.716918195" watchObservedRunningTime="2025-12-05 20:07:15.420474506 +0000 UTC m=+100.717290167"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.444927 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-msl9r" podStartSLOduration=78.444910503 podStartE2EDuration="1m18.444910503s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.444768768 +0000 UTC m=+100.741584429" watchObservedRunningTime="2025-12-05 20:07:15.444910503 +0000 UTC m=+100.741726174"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.470500 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.470487228 podStartE2EDuration="46.470487228s" podCreationTimestamp="2025-12-05 20:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.470159077 +0000 UTC m=+100.766974738" watchObservedRunningTime="2025-12-05 20:07:15.470487228 +0000 UTC m=+100.767302889"
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.481708 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.481684812 podStartE2EDuration="9.481684812s" podCreationTimestamp="2025-12-05 20:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:15.481353511 +0000 UTC m=+100.778169182" watchObservedRunningTime="2025-12-05 20:07:15.481684812 +0000 UTC m=+100.778500473"
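In each "Observed pod startup duration" record, podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp (for kube-controller-manager-crc: 20:07:15.225465558 - 20:05:58 = 77.225465558s, exactly the logged value). A small sketch to recompute it, assuming the trailing "UTC" and "m=+..." tokens are stripped and the nanoseconds truncated to microseconds, since Python's %f accepts at most six fractional digits:

from datetime import datetime

def start_slo_duration(created: str, observed: str) -> float:
    # Recompute podStartSLOduration as watchObservedRunningTime -
    # podCreationTimestamp, using the timestamp layout seen in these
    # records (trailing "UTC" and "m=+..." tokens already removed).
    t0 = datetime.strptime(created, "%Y-%m-%d %H:%M:%S %z")
    t1 = datetime.strptime(observed, "%Y-%m-%d %H:%M:%S.%f %z")
    return (t1 - t0).total_seconds()

# kube-controller-manager-crc, from the records above
# (nanoseconds truncated to microseconds):
print(start_slo_duration("2025-12-05 20:05:58 +0000",
                         "2025-12-05 20:07:15.225465 +0000"))  # 77.225465

The same arithmetic checks out for etcd-crc (20:07:15.22582315 - 20:07:09 = 6.22582315s) and the other pods recorded here.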
Dec 05 20:07:15 crc kubenswrapper[4885]: I1205 20:07:15.737265 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.737517 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:07:15 crc kubenswrapper[4885]: E1205 20:07:15.737614 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs podName:a5c0a952-e24a-49c2-b4ba-e20be61b840d nodeName:}" failed. No retries permitted until 2025-12-05 20:08:19.737583227 +0000 UTC m=+165.034398928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs") pod "network-metrics-daemon-2jdj4" (UID: "a5c0a952-e24a-49c2-b4ba-e20be61b840d") : object "openshift-multus"/"metrics-daemon-secret" not registered
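The mount failure above is not retried immediately: the operation is blocked until 20:08:19, a durationBeforeRetry of 1m4s (64s), which is consistent with an exponential, doubling backoff on repeated failures. A sketch of that pattern follows; the base, factor, and cap are illustrative assumptions, not values read from the kubelet:

def backoff_schedule(base=0.5, factor=2.0, cap=122.0, attempts=10):
    # Illustrative exponential backoff: the wait doubles after each
    # failed attempt, up to a cap. With base=0.5s the 8th failure waits
    # 64s (= the 1m4s durationBeforeRetry above). base/factor/cap here
    # are assumptions for illustration, not read from this log.
    delay = base
    for attempt in range(1, attempts + 1):
        yield attempt, min(delay, cap)
        delay *= factor

for attempt, delay in backoff_schedule():
    print(f"attempt {attempt}: wait {delay:g}s before retry")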
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.172259 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.172270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.172338 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.172550 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:07:17 crc kubenswrapper[4885]: E1205 20:07:17.172541 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:07:17 crc kubenswrapper[4885]: E1205 20:07:17.172635 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:07:17 crc kubenswrapper[4885]: E1205 20:07:17.172701 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d"
Dec 05 20:07:17 crc kubenswrapper[4885]: E1205 20:07:17.172754 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.644201 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.644272 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.644295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.644326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.644348 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:17Z","lastTransitionTime":"2025-12-05T20:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.747800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.747867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.747908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.747948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.747961 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:17Z","lastTransitionTime":"2025-12-05T20:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.850832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.850972 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.850999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.851071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.851095 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:17Z","lastTransitionTime":"2025-12-05T20:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.953250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.953290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.953300 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.953313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:17 crc kubenswrapper[4885]: I1205 20:07:17.953322 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:17Z","lastTransitionTime":"2025-12-05T20:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.056291 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.056368 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.056393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.056425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.056444 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.159696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.159779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.159799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.159824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.159842 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.262827 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.262937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.262959 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.262984 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.263004 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.365696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.365761 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.365778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.365803 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.365821 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.468515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.468592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.468614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.468641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.468657 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.476177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.476252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.476289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.476320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.476338 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:07:18Z","lastTransitionTime":"2025-12-05T20:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.543128 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc"] Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.543674 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.546794 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.548050 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.548352 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.548645 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.671417 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.671494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ef2326-9b4c-467d-8787-7573194e853c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.671708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36ef2326-9b4c-467d-8787-7573194e853c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.671774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.671824 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36ef2326-9b4c-467d-8787-7573194e853c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772627 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ef2326-9b4c-467d-8787-7573194e853c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772765 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36ef2326-9b4c-467d-8787-7573194e853c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772828 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36ef2326-9b4c-467d-8787-7573194e853c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772883 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.772809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36ef2326-9b4c-467d-8787-7573194e853c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.774729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ef2326-9b4c-467d-8787-7573194e853c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.782127 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/36ef2326-9b4c-467d-8787-7573194e853c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.804150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36ef2326-9b4c-467d-8787-7573194e853c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2l9wc\" (UID: \"36ef2326-9b4c-467d-8787-7573194e853c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:18 crc kubenswrapper[4885]: I1205 20:07:18.870244 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.172188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.172233 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.172229 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.172280 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:19 crc kubenswrapper[4885]: E1205 20:07:19.173291 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:19 crc kubenswrapper[4885]: E1205 20:07:19.173473 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:19 crc kubenswrapper[4885]: E1205 20:07:19.173620 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:19 crc kubenswrapper[4885]: E1205 20:07:19.173714 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.174236 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:07:19 crc kubenswrapper[4885]: E1205 20:07:19.174517 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.779514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" event={"ID":"36ef2326-9b4c-467d-8787-7573194e853c","Type":"ContainerStarted","Data":"2e4625df7999cd427f34c59e8339f86922fb4196766b1cbae3474bf19f23e7e7"} Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.779599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" event={"ID":"36ef2326-9b4c-467d-8787-7573194e853c","Type":"ContainerStarted","Data":"88d79a56141feca888e424d3569e427f493495f4a18ba65f628fc4d3f3d721fc"} Dec 05 20:07:19 crc kubenswrapper[4885]: I1205 20:07:19.794658 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2l9wc" podStartSLOduration=82.794631046 podStartE2EDuration="1m22.794631046s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:19.793310301 +0000 UTC m=+105.090126042" watchObservedRunningTime="2025-12-05 20:07:19.794631046 +0000 UTC m=+105.091446747" Dec 05 20:07:21 crc kubenswrapper[4885]: I1205 20:07:21.172070 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:21 crc kubenswrapper[4885]: I1205 20:07:21.172128 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:21 crc kubenswrapper[4885]: I1205 20:07:21.172128 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:21 crc kubenswrapper[4885]: E1205 20:07:21.172225 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:21 crc kubenswrapper[4885]: E1205 20:07:21.172345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:21 crc kubenswrapper[4885]: I1205 20:07:21.172395 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:21 crc kubenswrapper[4885]: E1205 20:07:21.172480 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:21 crc kubenswrapper[4885]: E1205 20:07:21.172586 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:23 crc kubenswrapper[4885]: I1205 20:07:23.172352 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:23 crc kubenswrapper[4885]: I1205 20:07:23.172377 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:23 crc kubenswrapper[4885]: E1205 20:07:23.173354 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:23 crc kubenswrapper[4885]: I1205 20:07:23.172506 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:23 crc kubenswrapper[4885]: I1205 20:07:23.172413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:23 crc kubenswrapper[4885]: E1205 20:07:23.173435 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:23 crc kubenswrapper[4885]: E1205 20:07:23.173550 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:23 crc kubenswrapper[4885]: E1205 20:07:23.173789 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:25 crc kubenswrapper[4885]: I1205 20:07:25.171783 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:25 crc kubenswrapper[4885]: I1205 20:07:25.171783 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:25 crc kubenswrapper[4885]: I1205 20:07:25.171875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:25 crc kubenswrapper[4885]: I1205 20:07:25.174118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:25 crc kubenswrapper[4885]: E1205 20:07:25.174119 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:25 crc kubenswrapper[4885]: E1205 20:07:25.174240 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:25 crc kubenswrapper[4885]: E1205 20:07:25.174336 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:25 crc kubenswrapper[4885]: E1205 20:07:25.174516 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:27 crc kubenswrapper[4885]: I1205 20:07:27.172077 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:27 crc kubenswrapper[4885]: I1205 20:07:27.172089 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:27 crc kubenswrapper[4885]: I1205 20:07:27.172284 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:27 crc kubenswrapper[4885]: I1205 20:07:27.172327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:27 crc kubenswrapper[4885]: E1205 20:07:27.172467 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:27 crc kubenswrapper[4885]: E1205 20:07:27.172504 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:27 crc kubenswrapper[4885]: E1205 20:07:27.172891 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:27 crc kubenswrapper[4885]: E1205 20:07:27.173102 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:29 crc kubenswrapper[4885]: I1205 20:07:29.172450 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:29 crc kubenswrapper[4885]: I1205 20:07:29.172479 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:29 crc kubenswrapper[4885]: E1205 20:07:29.172639 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:29 crc kubenswrapper[4885]: I1205 20:07:29.172687 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:29 crc kubenswrapper[4885]: I1205 20:07:29.172712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:29 crc kubenswrapper[4885]: E1205 20:07:29.172851 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:29 crc kubenswrapper[4885]: E1205 20:07:29.173003 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:29 crc kubenswrapper[4885]: E1205 20:07:29.173185 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.827439 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/1.log" Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.828252 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/0.log" Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.828294 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6c25e90-efcc-490c-afef-970c3a62c809" containerID="23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab" exitCode=1 Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.828330 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerDied","Data":"23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab"} Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.828367 4885 scope.go:117] "RemoveContainer" containerID="245ec9fd3abffff6d552526b2be01ace97d0c0202e556a8ec487aaba994a710d" Dec 05 20:07:30 crc kubenswrapper[4885]: I1205 20:07:30.828716 4885 scope.go:117] "RemoveContainer" containerID="23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab" Dec 05 20:07:30 crc kubenswrapper[4885]: E1205 20:07:30.828888 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zmtwj_openshift-multus(c6c25e90-efcc-490c-afef-970c3a62c809)\"" pod="openshift-multus/multus-zmtwj" podUID="c6c25e90-efcc-490c-afef-970c3a62c809" Dec 05 20:07:31 crc kubenswrapper[4885]: I1205 20:07:31.171838 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:31 crc kubenswrapper[4885]: I1205 20:07:31.171929 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:31 crc kubenswrapper[4885]: I1205 20:07:31.171868 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:31 crc kubenswrapper[4885]: E1205 20:07:31.172061 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:31 crc kubenswrapper[4885]: I1205 20:07:31.172138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:31 crc kubenswrapper[4885]: E1205 20:07:31.172285 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:31 crc kubenswrapper[4885]: E1205 20:07:31.172486 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:31 crc kubenswrapper[4885]: E1205 20:07:31.172624 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:31 crc kubenswrapper[4885]: I1205 20:07:31.832456 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/1.log" Dec 05 20:07:33 crc kubenswrapper[4885]: I1205 20:07:33.172560 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:33 crc kubenswrapper[4885]: I1205 20:07:33.172665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:33 crc kubenswrapper[4885]: I1205 20:07:33.172683 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:33 crc kubenswrapper[4885]: I1205 20:07:33.172772 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:33 crc kubenswrapper[4885]: E1205 20:07:33.172782 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:33 crc kubenswrapper[4885]: E1205 20:07:33.173014 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:33 crc kubenswrapper[4885]: E1205 20:07:33.173182 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:33 crc kubenswrapper[4885]: E1205 20:07:33.173360 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:34 crc kubenswrapper[4885]: I1205 20:07:34.175780 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:07:34 crc kubenswrapper[4885]: E1205 20:07:34.176227 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wx7m6_openshift-ovn-kubernetes(86ae690a-3705-45ae-8816-da5f33d2105e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.123746 4885 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 20:07:35 crc kubenswrapper[4885]: I1205 20:07:35.172585 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:35 crc kubenswrapper[4885]: I1205 20:07:35.172662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:35 crc kubenswrapper[4885]: I1205 20:07:35.172607 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:35 crc kubenswrapper[4885]: I1205 20:07:35.172734 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.175056 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.175144 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.175331 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.175490 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:35 crc kubenswrapper[4885]: E1205 20:07:35.277880 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:07:37 crc kubenswrapper[4885]: I1205 20:07:37.172188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:37 crc kubenswrapper[4885]: I1205 20:07:37.172277 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:37 crc kubenswrapper[4885]: E1205 20:07:37.172388 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:37 crc kubenswrapper[4885]: I1205 20:07:37.172495 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:37 crc kubenswrapper[4885]: E1205 20:07:37.172611 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:37 crc kubenswrapper[4885]: I1205 20:07:37.172205 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:37 crc kubenswrapper[4885]: E1205 20:07:37.172710 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:37 crc kubenswrapper[4885]: E1205 20:07:37.172785 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:39 crc kubenswrapper[4885]: I1205 20:07:39.172673 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:39 crc kubenswrapper[4885]: I1205 20:07:39.172702 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:39 crc kubenswrapper[4885]: I1205 20:07:39.172762 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:39 crc kubenswrapper[4885]: E1205 20:07:39.172807 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:39 crc kubenswrapper[4885]: I1205 20:07:39.173006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:39 crc kubenswrapper[4885]: E1205 20:07:39.173043 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:39 crc kubenswrapper[4885]: E1205 20:07:39.173109 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:39 crc kubenswrapper[4885]: E1205 20:07:39.173177 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:40 crc kubenswrapper[4885]: E1205 20:07:40.279918 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:07:41 crc kubenswrapper[4885]: I1205 20:07:41.171953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:41 crc kubenswrapper[4885]: I1205 20:07:41.172011 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:41 crc kubenswrapper[4885]: I1205 20:07:41.172149 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:41 crc kubenswrapper[4885]: I1205 20:07:41.172381 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:41 crc kubenswrapper[4885]: E1205 20:07:41.172358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:41 crc kubenswrapper[4885]: E1205 20:07:41.172555 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:41 crc kubenswrapper[4885]: E1205 20:07:41.172973 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:41 crc kubenswrapper[4885]: E1205 20:07:41.173079 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:43 crc kubenswrapper[4885]: I1205 20:07:43.172157 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:43 crc kubenswrapper[4885]: I1205 20:07:43.172218 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:43 crc kubenswrapper[4885]: E1205 20:07:43.172358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:43 crc kubenswrapper[4885]: I1205 20:07:43.172393 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:43 crc kubenswrapper[4885]: I1205 20:07:43.172377 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:43 crc kubenswrapper[4885]: E1205 20:07:43.172498 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:43 crc kubenswrapper[4885]: E1205 20:07:43.172574 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:43 crc kubenswrapper[4885]: E1205 20:07:43.172662 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.171988 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:45 crc kubenswrapper[4885]: E1205 20:07:45.173432 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.173470 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.173553 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:45 crc kubenswrapper[4885]: E1205 20:07:45.173738 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.173786 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:45 crc kubenswrapper[4885]: E1205 20:07:45.174179 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.174444 4885 scope.go:117] "RemoveContainer" containerID="23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab" Dec 05 20:07:45 crc kubenswrapper[4885]: E1205 20:07:45.174491 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:45 crc kubenswrapper[4885]: E1205 20:07:45.280822 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.884236 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/1.log" Dec 05 20:07:45 crc kubenswrapper[4885]: I1205 20:07:45.885059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerStarted","Data":"d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad"} Dec 05 20:07:47 crc kubenswrapper[4885]: I1205 20:07:47.172295 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:47 crc kubenswrapper[4885]: I1205 20:07:47.172408 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:47 crc kubenswrapper[4885]: I1205 20:07:47.172453 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:47 crc kubenswrapper[4885]: E1205 20:07:47.172479 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:47 crc kubenswrapper[4885]: I1205 20:07:47.172496 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:47 crc kubenswrapper[4885]: E1205 20:07:47.172626 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:47 crc kubenswrapper[4885]: E1205 20:07:47.172795 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:47 crc kubenswrapper[4885]: E1205 20:07:47.172885 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:48 crc kubenswrapper[4885]: I1205 20:07:48.173624 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:07:48 crc kubenswrapper[4885]: I1205 20:07:48.898892 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/3.log" Dec 05 20:07:48 crc kubenswrapper[4885]: I1205 20:07:48.902090 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerStarted","Data":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:07:48 crc kubenswrapper[4885]: I1205 20:07:48.902446 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:07:48 crc kubenswrapper[4885]: I1205 20:07:48.933945 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podStartSLOduration=111.933928032 podStartE2EDuration="1m51.933928032s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:07:48.932678701 +0000 UTC m=+134.229494432" watchObservedRunningTime="2025-12-05 20:07:48.933928032 +0000 UTC m=+134.230743693" Dec 05 20:07:49 crc kubenswrapper[4885]: I1205 20:07:49.029442 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2jdj4"] Dec 05 20:07:49 crc kubenswrapper[4885]: I1205 20:07:49.029552 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:49 crc kubenswrapper[4885]: E1205 20:07:49.029654 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:49 crc kubenswrapper[4885]: I1205 20:07:49.172735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:49 crc kubenswrapper[4885]: I1205 20:07:49.172763 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:49 crc kubenswrapper[4885]: E1205 20:07:49.172906 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:49 crc kubenswrapper[4885]: E1205 20:07:49.173062 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:49 crc kubenswrapper[4885]: I1205 20:07:49.173154 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:49 crc kubenswrapper[4885]: E1205 20:07:49.173269 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:50 crc kubenswrapper[4885]: E1205 20:07:50.282259 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:07:51 crc kubenswrapper[4885]: I1205 20:07:51.172150 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:51 crc kubenswrapper[4885]: I1205 20:07:51.172226 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:51 crc kubenswrapper[4885]: E1205 20:07:51.172332 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:51 crc kubenswrapper[4885]: E1205 20:07:51.172517 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:51 crc kubenswrapper[4885]: I1205 20:07:51.172172 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:51 crc kubenswrapper[4885]: E1205 20:07:51.172655 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:51 crc kubenswrapper[4885]: I1205 20:07:51.172753 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:51 crc kubenswrapper[4885]: E1205 20:07:51.172888 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:53 crc kubenswrapper[4885]: I1205 20:07:53.172668 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:53 crc kubenswrapper[4885]: I1205 20:07:53.172698 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:53 crc kubenswrapper[4885]: E1205 20:07:53.173209 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:53 crc kubenswrapper[4885]: I1205 20:07:53.172718 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:53 crc kubenswrapper[4885]: I1205 20:07:53.172811 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:53 crc kubenswrapper[4885]: E1205 20:07:53.173291 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:53 crc kubenswrapper[4885]: E1205 20:07:53.173371 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:53 crc kubenswrapper[4885]: E1205 20:07:53.173445 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:55 crc kubenswrapper[4885]: I1205 20:07:55.171688 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:55 crc kubenswrapper[4885]: I1205 20:07:55.171688 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:55 crc kubenswrapper[4885]: I1205 20:07:55.171739 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:55 crc kubenswrapper[4885]: I1205 20:07:55.172588 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:55 crc kubenswrapper[4885]: E1205 20:07:55.172576 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2jdj4" podUID="a5c0a952-e24a-49c2-b4ba-e20be61b840d" Dec 05 20:07:55 crc kubenswrapper[4885]: E1205 20:07:55.172648 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:07:55 crc kubenswrapper[4885]: E1205 20:07:55.172717 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:07:55 crc kubenswrapper[4885]: E1205 20:07:55.172773 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.171782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.171871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.171809 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.173008 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.174539 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.175643 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.175916 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.176237 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.176244 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:07:57 crc kubenswrapper[4885]: I1205 20:07:57.176462 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.068327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.117925 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chk47"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.118730 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.123800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.124167 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.124552 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.125248 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.126405 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.126591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.130962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chk47"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.138551 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.239635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.239731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vthl\" (UniqueName: \"kubernetes.io/projected/e9b67962-b520-4395-ba1a-72e0ca4e0240-kube-api-access-2vthl\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.239873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-config\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.239918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-service-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 
20:07:59.240013 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b67962-b520-4395-ba1a-72e0ca4e0240-serving-cert\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.341298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vthl\" (UniqueName: \"kubernetes.io/projected/e9b67962-b520-4395-ba1a-72e0ca4e0240-kube-api-access-2vthl\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.341356 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-config\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.341374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-service-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.341402 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b67962-b520-4395-ba1a-72e0ca4e0240-serving-cert\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.341450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.343139 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.343270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-service-ca-bundle\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.344752 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9b67962-b520-4395-ba1a-72e0ca4e0240-config\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.353947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b67962-b520-4395-ba1a-72e0ca4e0240-serving-cert\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.369837 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vthl\" (UniqueName: \"kubernetes.io/projected/e9b67962-b520-4395-ba1a-72e0ca4e0240-kube-api-access-2vthl\") pod \"authentication-operator-69f744f599-chk47\" (UID: \"e9b67962-b520-4395-ba1a-72e0ca4e0240\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.443930 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.570209 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.570955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.571867 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jdls9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.572686 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.577001 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.577441 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.577939 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.578380 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.578523 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.578736 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.579168 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.585416 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wdpql"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.586077 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.587996 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.592534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594096 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594186 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594251 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594186 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594487 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.594563 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.595282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.596184 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.598267 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5g9z9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.598771 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.599654 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.600227 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.600663 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.600995 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.601273 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.601764 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.607127 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.621557 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.621829 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.621906 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.621843 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.626394 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.633460 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.633536 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.633461 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.633959 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: W1205 20:07:59.634486 4885 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 05 20:07:59 crc kubenswrapper[4885]: E1205 20:07:59.634542 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644613 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644673 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644796 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644872 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644916 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644951 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.644999 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645186 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645261 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645342 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645435 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645526 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645597 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645671 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645765 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.645956 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646196 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646279 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646354 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646542 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646758 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n"] 
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.646928 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.647320 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.647400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"]
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.647736 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.647988 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648476 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wwv\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-kube-api-access-c2wwv\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-serving-cert\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648649 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648669 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqjmb\" (UniqueName: \"kubernetes.io/projected/6e273248-d3b5-4248-9e30-06ae7c6ab889-kube-api-access-mqjmb\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648753 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648772 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-image-import-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit-dir\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-config\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3b6b866-d318-454f-9730-e56de394d130-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-policies\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648933 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67bs2\" (UniqueName: \"kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-etcd-client\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shw2s\" (UniqueName: \"kubernetes.io/projected/d4e08709-09d1-497f-9d79-83b90f495bbf-kube-api-access-shw2s\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.648989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-serving-cert\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.649053 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6986h\" (UniqueName: \"kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.649083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.649102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.649120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-encryption-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.649688 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk"
Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650322 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-auth-proxy-config\") pod
\"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-encryption-config\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650403 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42769059-74c9-48f7-bdb2-7b97903610ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-dir\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650444 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tx6\" (UniqueName: \"kubernetes.io/projected/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-kube-api-access-n9tx6\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.650464 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrmv\" (UniqueName: \"kubernetes.io/projected/42769059-74c9-48f7-bdb2-7b97903610ba-kube-api-access-mmrmv\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651564 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651597 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651618 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42769059-74c9-48f7-bdb2-7b97903610ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651745 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651791 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651916 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b6b866-d318-454f-9730-e56de394d130-serving-cert\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651925 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651938 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-serving-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651964 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvx5q\" (UniqueName: \"kubernetes.io/projected/875142b2-83ed-4d64-88b5-a885640981d1-kube-api-access-pvx5q\") pod \"etcd-operator-b45778765-5g9z9\" (UID: 
\"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.651998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-node-pullsecrets\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652012 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-client\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652062 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8g6\" (UniqueName: \"kubernetes.io/projected/d3b6b866-d318-454f-9730-e56de394d130-kube-api-access-4t8g6\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-client\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652139 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8524w\" (UniqueName: \"kubernetes.io/projected/98e1f477-999e-4584-a373-c07abd3a938c-kube-api-access-8524w\") pod \"downloads-7954f5f757-jdls9\" (UID: \"98e1f477-999e-4584-a373-c07abd3a938c\") " pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-serving-cert\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclpm\" (UniqueName: \"kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm\") pod 
\"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652188 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-service-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652199 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d4e08709-09d1-497f-9d79-83b90f495bbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652286 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z2sb2"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-config\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652409 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652417 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652452 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652522 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652550 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652593 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652625 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652680 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652722 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652864 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.652952 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653126 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653251 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653402 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653533 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653613 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653707 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653832 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.653910 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.659876 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.663595 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.665220 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.666623 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.666838 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.667007 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.667072 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.673454 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.673599 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.673730 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.673864 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674221 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674549 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674585 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674648 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.674742 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.681216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.681720 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.682875 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.683522 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 
05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684231 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684314 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684413 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684436 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684542 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684557 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684603 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.684692 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.685013 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x67dc"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.685405 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.685616 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.687424 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.691757 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.691798 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.691755 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.692725 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.692776 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.702313 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.703782 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.704860 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.707416 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.708952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.711144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.713131 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.729552 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.730239 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.730703 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.731000 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.731163 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.731377 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.732566 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.736171 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.741107 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.741496 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.741875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.744199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x24g6"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.744684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.744976 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.751121 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.751684 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.752318 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c9jnh"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.752875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.753210 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.753363 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754254 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754696 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754921 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754978 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqjmb\" (UniqueName: \"kubernetes.io/projected/6e273248-d3b5-4248-9e30-06ae7c6ab889-kube-api-access-mqjmb\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.754995 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755013 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-image-import-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit-dir\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-config\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3b6b866-d318-454f-9730-e56de394d130-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755155 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755198 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-policies\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755229 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755254 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755277 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-67bs2\" (UniqueName: \"kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755297 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-etcd-client\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755304 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shw2s\" (UniqueName: \"kubernetes.io/projected/d4e08709-09d1-497f-9d79-83b90f495bbf-kube-api-access-shw2s\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-serving-cert\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-config\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755413 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6986h\" (UniqueName: \"kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755465 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-encryption-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755483 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgv6t\" (UniqueName: \"kubernetes.io/projected/3a17b36b-0a7a-427b-9602-27aa06f15f73-kube-api-access-zgv6t\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755501 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-auth-proxy-config\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-encryption-config\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-config\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42769059-74c9-48f7-bdb2-7b97903610ba-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-dir\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755629 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tx6\" (UniqueName: \"kubernetes.io/projected/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-kube-api-access-n9tx6\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755644 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrmv\" (UniqueName: \"kubernetes.io/projected/42769059-74c9-48f7-bdb2-7b97903610ba-kube-api-access-mmrmv\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755659 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42769059-74c9-48f7-bdb2-7b97903610ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b6b866-d318-454f-9730-e56de394d130-serving-cert\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: 
\"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755754 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-serving-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq5pn\" (UniqueName: \"kubernetes.io/projected/f1540bd7-e50c-4f68-864c-a58e8c81bb03-kube-api-access-jq5pn\") pod \"migrator-59844c95c7-9n5ft\" (UID: \"f1540bd7-e50c-4f68-864c-a58e8c81bb03\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755796 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvx5q\" (UniqueName: \"kubernetes.io/projected/875142b2-83ed-4d64-88b5-a885640981d1-kube-api-access-pvx5q\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-node-pullsecrets\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755840 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-client\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8g6\" (UniqueName: \"kubernetes.io/projected/d3b6b866-d318-454f-9730-e56de394d130-kube-api-access-4t8g6\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755878 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-client\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755909 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8524w\" (UniqueName: \"kubernetes.io/projected/98e1f477-999e-4584-a373-c07abd3a938c-kube-api-access-8524w\") pod \"downloads-7954f5f757-jdls9\" (UID: \"98e1f477-999e-4584-a373-c07abd3a938c\") " pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755924 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-serving-cert\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755941 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6086abe8-3970-4d1c-9f3f-8075de87b8ec-serving-cert\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclpm\" (UniqueName: \"kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-service-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.755994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d4e08709-09d1-497f-9d79-83b90f495bbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9z6\" (UniqueName: \"kubernetes.io/projected/6086abe8-3970-4d1c-9f3f-8075de87b8ec-kube-api-access-bx9z6\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-config\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756092 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-trusted-ca\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wwv\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-kube-api-access-c2wwv\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756177 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-serving-cert\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756193 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a17b36b-0a7a-427b-9602-27aa06f15f73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a17b36b-0a7a-427b-9602-27aa06f15f73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756526 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nznv9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.756817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-policies\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.757289 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.757540 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.759319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-serving-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.759507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.760003 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.760003 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.760960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-config\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.761427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.761534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-node-pullsecrets\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.761800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-service-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.761845 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-image-import-ca\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.761898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit-dir\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.762295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3b6b866-d318-454f-9730-e56de394d130-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.762511 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-89b4n"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.763675 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.764622 
4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.764810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-etcd-client\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.764930 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bvl5h"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.765313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-etcd-ca\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.765631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.765701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.765816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.766297 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.766459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.769140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-serving-cert\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.769289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.772910 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-serving-cert\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.769746 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mfhrt"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.769718 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-audit\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.770307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.773322 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42769059-74c9-48f7-bdb2-7b97903610ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.770532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e273248-d3b5-4248-9e30-06ae7c6ab889-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.770636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4e08709-09d1-497f-9d79-83b90f495bbf-auth-proxy-config\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.771141 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.771361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.772293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.772548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/875142b2-83ed-4d64-88b5-a885640981d1-etcd-client\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.770839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-audit-dir\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.773945 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-serving-cert\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.774233 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mfhrt" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.769831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875142b2-83ed-4d64-88b5-a885640981d1-config\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.774364 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.774598 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.774679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.775590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-etcd-client\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.775795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-encryption-config\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.777278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.777705 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e273248-d3b5-4248-9e30-06ae7c6ab889-encryption-config\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.777855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b6b866-d318-454f-9730-e56de394d130-serving-cert\") pod \"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.778870 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42769059-74c9-48f7-bdb2-7b97903610ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.778914 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.779606 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.779788 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.780487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.782227 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-h6jst"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.782976 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.783779 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8tph2"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.784614 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.786530 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vs7jr"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.788563 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jdls9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.788584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.788663 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.789704 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.790735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.792042 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.793181 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.793265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.803782 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.804529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d4e08709-09d1-497f-9d79-83b90f495bbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.808066 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.808629 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wdpql"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.810897 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.813556 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.815956 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.817299 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.821618 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.822749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.824157 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.825688 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-z2sb2"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.829713 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5g9z9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.830761 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.831757 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.832762 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.833198 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.833788 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.834822 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x24g6"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.835791 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.836829 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.837856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.839067 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c9jnh"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.840245 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.841315 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p"] Dec 05 20:07:59 crc kubenswrapper[4885]: W1205 20:07:59.842628 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b67962_b520_4395_ba1a_72e0ca4e0240.slice/crio-d878fadd9dae50157c89f0615ad38aa2ad903fbc818a866c3f70971a1dadfbe6 WatchSource:0}: Error finding container d878fadd9dae50157c89f0615ad38aa2ad903fbc818a866c3f70971a1dadfbe6: Status 404 returned error can't find the container with id d878fadd9dae50157c89f0615ad38aa2ad903fbc818a866c3f70971a1dadfbe6 Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.842717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.843852 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfhrt"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.845006 
4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.847541 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x67dc"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.848113 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8tph2"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.850012 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.850272 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.851326 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bvl5h"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.852441 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nznv9"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.853203 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.853494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vs7jr"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.854424 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chk47"] Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq5pn\" (UniqueName: \"kubernetes.io/projected/f1540bd7-e50c-4f68-864c-a58e8c81bb03-kube-api-access-jq5pn\") pod \"migrator-59844c95c7-9n5ft\" (UID: \"f1540bd7-e50c-4f68-864c-a58e8c81bb03\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857271 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6086abe8-3970-4d1c-9f3f-8075de87b8ec-serving-cert\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9z6\" (UniqueName: \"kubernetes.io/projected/6086abe8-3970-4d1c-9f3f-8075de87b8ec-kube-api-access-bx9z6\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857315 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-trusted-ca\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857342 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a17b36b-0a7a-427b-9602-27aa06f15f73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857362 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a17b36b-0a7a-427b-9602-27aa06f15f73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-config\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgv6t\" (UniqueName: \"kubernetes.io/projected/3a17b36b-0a7a-427b-9602-27aa06f15f73-kube-api-access-zgv6t\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.857525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-config\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.858639 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-config\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.858654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6086abe8-3970-4d1c-9f3f-8075de87b8ec-trusted-ca\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.859585 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-config\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.859826 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a17b36b-0a7a-427b-9602-27aa06f15f73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.861099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a17b36b-0a7a-427b-9602-27aa06f15f73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.861178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.861615 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6086abe8-3970-4d1c-9f3f-8075de87b8ec-serving-cert\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.873368 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.893127 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.913479 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.946932 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" event={"ID":"e9b67962-b520-4395-ba1a-72e0ca4e0240","Type":"ContainerStarted","Data":"7b767bc8bf2242737347dec8d3ed12e62eaeabaec8dc4c2db1bcdc1c1af7024a"} Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.946976 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" 
event={"ID":"e9b67962-b520-4395-ba1a-72e0ca4e0240","Type":"ContainerStarted","Data":"d878fadd9dae50157c89f0615ad38aa2ad903fbc818a866c3f70971a1dadfbe6"} Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.949167 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.965436 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.973885 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 20:07:59 crc kubenswrapper[4885]: I1205 20:07:59.993635 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.013393 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.054309 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.073988 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.093339 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.113565 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.133981 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.173381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.193423 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.213840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.234076 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.252826 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.273237 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.294202 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 
20:08:00.313872 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.333911 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.353774 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.377269 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.395482 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.414048 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.434756 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.454347 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.474960 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.494499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.516512 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.533456 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.553977 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.562782 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.575059 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.596155 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.613395 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.635470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.655357 4885 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.675131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.694375 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.714602 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.735140 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.754064 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.771701 4885 request.go:700] Waited for 1.016366773s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0 Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.774120 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.794234 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.814319 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.834829 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.853973 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.874304 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.894220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.967956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqjmb\" (UniqueName: \"kubernetes.io/projected/6e273248-d3b5-4248-9e30-06ae7c6ab889-kube-api-access-mqjmb\") pod \"apiserver-76f77b778f-wdpql\" (UID: \"6e273248-d3b5-4248-9e30-06ae7c6ab889\") " pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:00 crc kubenswrapper[4885]: I1205 20:08:00.985952 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8g6\" (UniqueName: \"kubernetes.io/projected/d3b6b866-d318-454f-9730-e56de394d130-kube-api-access-4t8g6\") pod 
\"openshift-config-operator-7777fb866f-8z75j\" (UID: \"d3b6b866-d318-454f-9730-e56de394d130\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.006174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8524w\" (UniqueName: \"kubernetes.io/projected/98e1f477-999e-4584-a373-c07abd3a938c-kube-api-access-8524w\") pod \"downloads-7954f5f757-jdls9\" (UID: \"98e1f477-999e-4584-a373-c07abd3a938c\") " pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.035254 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.038565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvx5q\" (UniqueName: \"kubernetes.io/projected/875142b2-83ed-4d64-88b5-a885640981d1-kube-api-access-pvx5q\") pod \"etcd-operator-b45778765-5g9z9\" (UID: \"875142b2-83ed-4d64-88b5-a885640981d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.041636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shw2s\" (UniqueName: \"kubernetes.io/projected/d4e08709-09d1-497f-9d79-83b90f495bbf-kube-api-access-shw2s\") pod \"machine-approver-56656f9798-6xd4d\" (UID: \"d4e08709-09d1-497f-9d79-83b90f495bbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.070920 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6986h\" (UniqueName: \"kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h\") pod \"console-f9d7485db-jdrlk\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.073862 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.093528 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.113224 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.118111 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.141231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.153859 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.172412 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.174579 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.223362 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.223644 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.234583 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.253907 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.259436 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.274415 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.285526 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.285751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.285830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.285857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.285970 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:08:01 crc kubenswrapper[4885]: 
E1205 20:08:01.286182 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:10:03.286163322 +0000 UTC m=+268.582978993 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.288634 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.289792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.290733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.291933 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.294524 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.295196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.313504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.314473 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.333847 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.339215 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.352563 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jdls9"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.386273 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wwv\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-kube-api-access-c2wwv\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.395085 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.397252 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac8a91ac-c2d0-40b7-aa23-cf2a0081e550-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mhxlk\" (UID: \"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.410484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclpm\" (UniqueName: \"kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm\") pod \"controller-manager-879f6c89f-xqrhh\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.411379 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.416387 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.418088 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.422703 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e1f477_999e_4584_a373_c07abd3a938c.slice/crio-9d1fad447eed90b7367716d72824e32359f3060c8b31fee3d91a7f33c0c6305b WatchSource:0}: Error finding container 9d1fad447eed90b7367716d72824e32359f3060c8b31fee3d91a7f33c0c6305b: Status 404 returned error can't find the container with id 9d1fad447eed90b7367716d72824e32359f3060c8b31fee3d91a7f33c0c6305b Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.433359 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrmv\" (UniqueName: \"kubernetes.io/projected/42769059-74c9-48f7-bdb2-7b97903610ba-kube-api-access-mmrmv\") pod \"openshift-controller-manager-operator-756b6f6bc6-95dk6\" (UID: \"42769059-74c9-48f7-bdb2-7b97903610ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.442766 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b6b866_d318_454f_9730_e56de394d130.slice/crio-31678bdfade4de8b06767b2fae6134025e9e6cc2c9d035e341494a0a356ceeac WatchSource:0}: Error finding container 31678bdfade4de8b06767b2fae6134025e9e6cc2c9d035e341494a0a356ceeac: Status 404 returned error can't find the container with id 31678bdfade4de8b06767b2fae6134025e9e6cc2c9d035e341494a0a356ceeac Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.453883 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.456718 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tx6\" (UniqueName: \"kubernetes.io/projected/dc1ce980-9bdc-4b28-9f12-ab17b79b981c-kube-api-access-n9tx6\") pod \"apiserver-7bbb656c7d-7w97v\" (UID: \"dc1ce980-9bdc-4b28-9f12-ab17b79b981c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.474015 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.482362 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wdpql"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.491227 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.493792 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.513638 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.521438 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5g9z9"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.534657 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.554329 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.567682 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875142b2_83ed_4d64_88b5_a885640981d1.slice/crio-1180e7e44d2ecb2fa93fe9f3098aac7e27f6c28a968ac221686c18bd0736e12a WatchSource:0}: Error finding container 1180e7e44d2ecb2fa93fe9f3098aac7e27f6c28a968ac221686c18bd0736e12a: Status 404 returned error can't find the container with id 1180e7e44d2ecb2fa93fe9f3098aac7e27f6c28a968ac221686c18bd0736e12a Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.574070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.587001 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.594511 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.617971 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543415d6_6aec_42f4_953f_3a760aefe1f2.slice/crio-d959286b80431f0d2f2ea0a360c5c69d30c4ece645c4b1550f0c2521a6d077a7 WatchSource:0}: Error finding container d959286b80431f0d2f2ea0a360c5c69d30c4ece645c4b1550f0c2521a6d077a7: Status 404 returned error can't find the container with id d959286b80431f0d2f2ea0a360c5c69d30c4ece645c4b1550f0c2521a6d077a7 Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.620222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.624397 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.629811 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.637755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.654647 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.674412 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.698334 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.702093 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.713565 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.715317 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-7c0422b66510dfd1b5f010fc94136496164ab4ed5a169077f5b5c0c687849587 WatchSource:0}: Error finding container 7c0422b66510dfd1b5f010fc94136496164ab4ed5a169077f5b5c0c687849587: Status 404 returned error can't find the container with id 7c0422b66510dfd1b5f010fc94136496164ab4ed5a169077f5b5c0c687849587 Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.715517 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-26cc9906ff784e75ecea02a9b8d0717bd0c045b313774fad7a2559dcc92d1843 WatchSource:0}: Error finding container 26cc9906ff784e75ecea02a9b8d0717bd0c045b313774fad7a2559dcc92d1843: Status 404 returned error can't find the container with id 26cc9906ff784e75ecea02a9b8d0717bd0c045b313774fad7a2559dcc92d1843 Dec 05 20:08:01 crc kubenswrapper[4885]: W1205 20:08:01.716447 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4db8699637df7c65ac823974d48539a7f1b6f860e05bff2d52a2e211de850a42 WatchSource:0}: Error finding container 4db8699637df7c65ac823974d48539a7f1b6f860e05bff2d52a2e211de850a42: Status 404 returned error can't find the container with id 4db8699637df7c65ac823974d48539a7f1b6f860e05bff2d52a2e211de850a42 Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.733983 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.749471 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.753481 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.772410 4885 request.go:700] Waited for 1.914993368s due to client-side 
throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/serviceaccounts/kube-storage-version-migrator-sa/token Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.797427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq5pn\" (UniqueName: \"kubernetes.io/projected/f1540bd7-e50c-4f68-864c-a58e8c81bb03-kube-api-access-jq5pn\") pod \"migrator-59844c95c7-9n5ft\" (UID: \"f1540bd7-e50c-4f68-864c-a58e8c81bb03\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.813131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9z6\" (UniqueName: \"kubernetes.io/projected/6086abe8-3970-4d1c-9f3f-8075de87b8ec-kube-api-access-bx9z6\") pod \"console-operator-58897d9998-z2sb2\" (UID: \"6086abe8-3970-4d1c-9f3f-8075de87b8ec\") " pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.832356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5953ed8-08cc-443c-a9e0-be5f96f3d8dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dcdhz\" (UID: \"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.849670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgv6t\" (UniqueName: \"kubernetes.io/projected/3a17b36b-0a7a-427b-9602-27aa06f15f73-kube-api-access-zgv6t\") pod \"openshift-apiserver-operator-796bbdcf4f-h9f8n\" (UID: \"3a17b36b-0a7a-427b-9602-27aa06f15f73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.893974 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.897921 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.897953 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgqx\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.897970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71071b19-6069-4981-afd0-e41274442bdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.897987 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2973ae71-2365-4bac-ab40-9aa54317c587-metrics-tls\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898052 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s887g\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-kube-api-access-s887g\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898070 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898365 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2k6v\" (UniqueName: \"kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898416 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898452 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: E1205 20:08:01.898566 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.398554631 +0000 UTC m=+147.695370292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.898959 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrj7\" (UniqueName: \"kubernetes.io/projected/71071b19-6069-4981-afd0-e41274442bdb-kube-api-access-qjrj7\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-srv-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.899872 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.900573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.900654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2973ae71-2365-4bac-ab40-9aa54317c587-trusted-ca\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.900922 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.900942 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4758\" (UniqueName: \"kubernetes.io/projected/3f0c2333-b88a-430a-b7a7-0882ef369aab-kube-api-access-r4758\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.900976 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa341b17-db5b-487c-9a0a-446a1842a78e-metrics-tls\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.901145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.901169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.901198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: 
\"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.901220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8nb\" (UniqueName: \"kubernetes.io/projected/fa341b17-db5b-487c-9a0a-446a1842a78e-kube-api-access-pc8nb\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.903405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67bs2\" (UniqueName: \"kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2\") pod \"route-controller-manager-6576b87f9c-sp659\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.937913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk"] Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.956204 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.959607 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" event={"ID":"6e273248-d3b5-4248-9e30-06ae7c6ab889","Type":"ContainerStarted","Data":"ce4efd6f0c8810f16ebd7755416bfa63b2ccc78d06434a3643643f559e97d30e"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.959657 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" event={"ID":"6e273248-d3b5-4248-9e30-06ae7c6ab889","Type":"ContainerStarted","Data":"81002a37e3db4361afc240c17024da02dacf2425d2395bd4d2e389bdf067a266"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.960818 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4db8699637df7c65ac823974d48539a7f1b6f860e05bff2d52a2e211de850a42"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.961756 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.962584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" event={"ID":"875142b2-83ed-4d64-88b5-a885640981d1","Type":"ContainerStarted","Data":"1180e7e44d2ecb2fa93fe9f3098aac7e27f6c28a968ac221686c18bd0736e12a"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.964102 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jdls9" event={"ID":"98e1f477-999e-4584-a373-c07abd3a938c","Type":"ContainerStarted","Data":"eb1e0ba418b3fedbabe7b305cea29bbeabc63bf607ca7c01f2d86bd1687be36d"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.964161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jdls9" event={"ID":"98e1f477-999e-4584-a373-c07abd3a938c","Type":"ContainerStarted","Data":"9d1fad447eed90b7367716d72824e32359f3060c8b31fee3d91a7f33c0c6305b"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.964951 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.967867 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-jdls9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.967905 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jdls9" podUID="98e1f477-999e-4584-a373-c07abd3a938c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.968581 4885 generic.go:334] "Generic (PLEG): container finished" podID="d3b6b866-d318-454f-9730-e56de394d130" containerID="472b2eaadce45c8e3bd820a8f253311ac2bf5df692141e0926d5a4c226bedb45" exitCode=0 Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.968764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" event={"ID":"d3b6b866-d318-454f-9730-e56de394d130","Type":"ContainerDied","Data":"472b2eaadce45c8e3bd820a8f253311ac2bf5df692141e0926d5a4c226bedb45"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.968798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" event={"ID":"d3b6b866-d318-454f-9730-e56de394d130","Type":"ContainerStarted","Data":"31678bdfade4de8b06767b2fae6134025e9e6cc2c9d035e341494a0a356ceeac"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.970974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dad20645defd4f1281dd6973fd55707d1638b3257aef22a68fa83c8507c246ad"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.971044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7c0422b66510dfd1b5f010fc94136496164ab4ed5a169077f5b5c0c687849587"} Dec 
05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.971221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.973766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" event={"ID":"d4e08709-09d1-497f-9d79-83b90f495bbf","Type":"ContainerStarted","Data":"1d53ca0ab5bcaa1c4b1ab7f2e79cdf0e64227238c5f8093d4303520ff5a2f3b2"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.973797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" event={"ID":"d4e08709-09d1-497f-9d79-83b90f495bbf","Type":"ContainerStarted","Data":"7ce54027b0bab0e731e2150204b3ae465b485b38ef2d277ad0c5f2b7f0dbf533"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.977776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdrlk" event={"ID":"543415d6-6aec-42f4-953f-3a760aefe1f2","Type":"ContainerStarted","Data":"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.977809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdrlk" event={"ID":"543415d6-6aec-42f4-953f-3a760aefe1f2","Type":"ContainerStarted","Data":"d959286b80431f0d2f2ea0a360c5c69d30c4ece645c4b1550f0c2521a6d077a7"} Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.990101 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" Dec 05 20:08:01 crc kubenswrapper[4885]: I1205 20:08:01.990756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" event={"ID":"42769059-74c9-48f7-bdb2-7b97903610ba","Type":"ContainerStarted","Data":"68decec662cf02c26bb563470c912d06ef1e391806f154defecc95f9d147e89c"} Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.000166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2606079cb2c5f5c7d9789bd8fc2a5c5f364f5decb269792a1bc6f4ff0d2fe318"} Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.000344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"26cc9906ff784e75ecea02a9b8d0717bd0c045b313774fad7a2559dcc92d1843"} Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.004702 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005323 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: 
\"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005356 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2k6v\" (UniqueName: \"kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-srv-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5hqk\" (UniqueName: \"kubernetes.io/projected/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-kube-api-access-j5hqk\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005604 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005624 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-service-ca-bundle\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005645 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b74d087-5bad-4236-bad9-9808abed29e2-metrics-tls\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 
20:08:02.005664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-cabundle\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f4b9f7-b478-4e60-a533-67a7ab786f86-proxy-tls\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005764 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-srv-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005806 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcv5\" (UniqueName: \"kubernetes.io/projected/0b74d087-5bad-4236-bad9-9808abed29e2-kube-api-access-4wcv5\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005902 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005933 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nndmw\" (UniqueName: \"kubernetes.io/projected/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-kube-api-access-nndmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-node-bootstrap-token\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005979 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjnlf\" (UniqueName: \"kubernetes.io/projected/bf46954d-2487-49ae-92ab-38c47c77c9c2-kube-api-access-bjnlf\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.005999 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtfh\" (UniqueName: \"kubernetes.io/projected/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-kube-api-access-bbtfh\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2973ae71-2365-4bac-ab40-9aa54317c587-trusted-ca\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-csi-data-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5v8t\" (UniqueName: \"kubernetes.io/projected/5cf77011-0fae-4535-8564-b6393dfe49bb-kube-api-access-d5v8t\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4758\" (UniqueName: \"kubernetes.io/projected/3f0c2333-b88a-430a-b7a7-0882ef369aab-kube-api-access-r4758\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxdwp\" (UniqueName: 
\"kubernetes.io/projected/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-kube-api-access-fxdwp\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006226 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgg5n\" (UniqueName: \"kubernetes.io/projected/67f2c73f-2335-4f37-86ba-0dec25c93c9e-kube-api-access-sgg5n\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006245 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-metrics-certs\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006266 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-images\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006363 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006381 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8vv\" (UniqueName: \"kubernetes.io/projected/61b3a667-550a-4955-91cc-8cd14d2a77c0-kube-api-access-sm8vv\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006405 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71071b19-6069-4981-afd0-e41274442bdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkxl\" (UniqueName: \"kubernetes.io/projected/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-kube-api-access-blkxl\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvlq\" (UniqueName: \"kubernetes.io/projected/24653880-7b0f-4174-ac74-5d13d99975e9-kube-api-access-whvlq\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006510 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s887g\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-kube-api-access-s887g\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f5d3875-9256-494d-a41d-5f4011c4462d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: 
I1205 20:08:02.006616 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e088d04-9b0c-45d3-8b98-d97b4065418c-proxy-tls\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bf46954d-2487-49ae-92ab-38c47c77c9c2-tmpfs\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f5d3875-9256-494d-a41d-5f4011c4462d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006738 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjpnn\" (UniqueName: \"kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e088d04-9b0c-45d3-8b98-d97b4065418c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-config\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 
20:08:02.006966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.006994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007116 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-mountpoint-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrj7\" (UniqueName: \"kubernetes.io/projected/71071b19-6069-4981-afd0-e41274442bdb-kube-api-access-qjrj7\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cf77011-0fae-4535-8564-b6393dfe49bb-serving-cert\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-cert\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5d3875-9256-494d-a41d-5f4011c4462d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b74d087-5bad-4236-bad9-9808abed29e2-config-volume\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.007302 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.009611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-default-certificate\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010891 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/24653880-7b0f-4174-ac74-5d13d99975e9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010922 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-certs\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010970 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.010999 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk28r\" (UniqueName: \"kubernetes.io/projected/f395140e-7a1c-45ce-8eab-9d11bf757838-kube-api-access-kk28r\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9nk\" (UniqueName: \"kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-registration-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-webhook-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh99j\" (UniqueName: \"kubernetes.io/projected/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-kube-api-access-nh99j\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwwzg\" (UniqueName: \"kubernetes.io/projected/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-kube-api-access-mwwzg\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011267 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa341b17-db5b-487c-9a0a-446a1842a78e-metrics-tls\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-plugins-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.011222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2973ae71-2365-4bac-ab40-9aa54317c587-trusted-ca\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.013949 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-srv-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.017431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.018214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.018561 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8nb\" (UniqueName: \"kubernetes.io/projected/fa341b17-db5b-487c-9a0a-446a1842a78e-kube-api-access-pc8nb\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-key\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jpp\" (UniqueName: \"kubernetes.io/projected/e7f4b9f7-b478-4e60-a533-67a7ab786f86-kube-api-access-x8jpp\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019928 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.019974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-config\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgqx\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020114 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2973ae71-2365-4bac-ab40-9aa54317c587-metrics-tls\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvfr\" (UniqueName: 
\"kubernetes.io/projected/5e088d04-9b0c-45d3-8b98-d97b4065418c-kube-api-access-hmvfr\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020232 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-socket-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.020979 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.021012 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-images\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.021415 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa341b17-db5b-487c-9a0a-446a1842a78e-metrics-tls\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.021521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.021715 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.022381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.027518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.028375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0c2333-b88a-430a-b7a7-0882ef369aab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.028811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.029758 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.029927 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf77011-0fae-4535-8564-b6393dfe49bb-config\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.030097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.030293 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.530256884 +0000 UTC m=+147.827072545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.030437 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-stats-auth\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.030558 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.035029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.035298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.038804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.039181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.041390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.041866 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.043640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.044065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.044731 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.044974 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.045284 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.045323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71071b19-6069-4981-afd0-e41274442bdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.045790 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.046007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: W1205 20:08:02.046553 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e46b72_f528_4f07_8b1e_96b98302ac86.slice/crio-56375deacdad25513422b9c51bab324fb3d7e97f288385e871f5a764e6acdc4e WatchSource:0}: Error finding container 56375deacdad25513422b9c51bab324fb3d7e97f288385e871f5a764e6acdc4e: Status 404 returned error can't find the container with id 56375deacdad25513422b9c51bab324fb3d7e97f288385e871f5a764e6acdc4e Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.046970 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2973ae71-2365-4bac-ab40-9aa54317c587-metrics-tls\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.050260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4758\" (UniqueName: \"kubernetes.io/projected/3f0c2333-b88a-430a-b7a7-0882ef369aab-kube-api-access-r4758\") pod \"olm-operator-6b444d44fb-6r5qr\" (UID: \"3f0c2333-b88a-430a-b7a7-0882ef369aab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.068328 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.092207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s887g\" (UniqueName: \"kubernetes.io/projected/2973ae71-2365-4bac-ab40-9aa54317c587-kube-api-access-s887g\") pod \"ingress-operator-5b745b69d9-z7z2s\" (UID: \"2973ae71-2365-4bac-ab40-9aa54317c587\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.104008 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.108181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrj7\" (UniqueName: \"kubernetes.io/projected/71071b19-6069-4981-afd0-e41274442bdb-kube-api-access-qjrj7\") pod \"cluster-samples-operator-665b6dd947-fgdjd\" (UID: \"71071b19-6069-4981-afd0-e41274442bdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-config\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e088d04-9b0c-45d3-8b98-d97b4065418c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134373 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-mountpoint-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cf77011-0fae-4535-8564-b6393dfe49bb-serving-cert\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134760 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-cert\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.134990 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5d3875-9256-494d-a41d-5f4011c4462d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: 
I1205 20:08:02.135058 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b74d087-5bad-4236-bad9-9808abed29e2-config-volume\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-default-certificate\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135788 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/24653880-7b0f-4174-ac74-5d13d99975e9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-certs\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135882 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk28r\" (UniqueName: \"kubernetes.io/projected/f395140e-7a1c-45ce-8eab-9d11bf757838-kube-api-access-kk28r\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9nk\" (UniqueName: \"kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135938 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-registration-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135963 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-webhook-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: 
\"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.135984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh99j\" (UniqueName: \"kubernetes.io/projected/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-kube-api-access-nh99j\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwwzg\" (UniqueName: \"kubernetes.io/projected/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-kube-api-access-mwwzg\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-plugins-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-key\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jpp\" (UniqueName: \"kubernetes.io/projected/e7f4b9f7-b478-4e60-a533-67a7ab786f86-kube-api-access-x8jpp\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-config\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136211 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvfr\" (UniqueName: \"kubernetes.io/projected/5e088d04-9b0c-45d3-8b98-d97b4065418c-kube-api-access-hmvfr\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136235 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-socket-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136292 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-images\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136372 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf77011-0fae-4535-8564-b6393dfe49bb-config\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-stats-auth\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.138702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b74d087-5bad-4236-bad9-9808abed29e2-config-volume\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.138897 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.638884278 +0000 UTC m=+147.935699949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.139379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf77011-0fae-4535-8564-b6393dfe49bb-config\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.139695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2k6v\" (UniqueName: \"kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v\") pod \"oauth-openshift-558db77b4-qcd9b\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.136328 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-mountpoint-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.139956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-registration-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.140262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-images\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.140499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-config\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.141596 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-socket-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.141929 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cf77011-0fae-4535-8564-b6393dfe49bb-serving-cert\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.142952 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-webhook-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.143750 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-certs\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.144428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-cert\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.144642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-srv-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-stats-auth\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145504 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5hqk\" (UniqueName: \"kubernetes.io/projected/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-kube-api-access-j5hqk\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-cabundle\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-service-ca-bundle\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145576 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b74d087-5bad-4236-bad9-9808abed29e2-metrics-tls\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145617 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f4b9f7-b478-4e60-a533-67a7ab786f86-proxy-tls\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145631 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcv5\" (UniqueName: 
\"kubernetes.io/projected/0b74d087-5bad-4236-bad9-9808abed29e2-kube-api-access-4wcv5\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145657 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nndmw\" (UniqueName: \"kubernetes.io/projected/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-kube-api-access-nndmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.146217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-key\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.146879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-signing-cabundle\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-service-ca-bundle\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.145705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-node-bootstrap-token\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjnlf\" (UniqueName: \"kubernetes.io/projected/bf46954d-2487-49ae-92ab-38c47c77c9c2-kube-api-access-bjnlf\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147733 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtfh\" (UniqueName: \"kubernetes.io/projected/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-kube-api-access-bbtfh\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 
20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147750 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-csi-data-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v8t\" (UniqueName: \"kubernetes.io/projected/5cf77011-0fae-4535-8564-b6393dfe49bb-kube-api-access-d5v8t\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxdwp\" (UniqueName: \"kubernetes.io/projected/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-kube-api-access-fxdwp\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147809 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147832 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgg5n\" (UniqueName: \"kubernetes.io/projected/67f2c73f-2335-4f37-86ba-0dec25c93c9e-kube-api-access-sgg5n\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147855 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147876 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-metrics-certs\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-images\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147909 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147927 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8vv\" (UniqueName: \"kubernetes.io/projected/61b3a667-550a-4955-91cc-8cd14d2a77c0-kube-api-access-sm8vv\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvlq\" (UniqueName: \"kubernetes.io/projected/24653880-7b0f-4174-ac74-5d13d99975e9-kube-api-access-whvlq\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.147992 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkxl\" (UniqueName: \"kubernetes.io/projected/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-kube-api-access-blkxl\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f5d3875-9256-494d-a41d-5f4011c4462d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e088d04-9b0c-45d3-8b98-d97b4065418c-proxy-tls\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bf46954d-2487-49ae-92ab-38c47c77c9c2-tmpfs\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3f5d3875-9256-494d-a41d-5f4011c4462d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjpnn\" (UniqueName: \"kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.148992 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-srv-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.149057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/67f2c73f-2335-4f37-86ba-0dec25c93c9e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.149503 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-default-certificate\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.150625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bf46954d-2487-49ae-92ab-38c47c77c9c2-tmpfs\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.150675 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf46954d-2487-49ae-92ab-38c47c77c9c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.150792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-csi-data-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.150869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e088d04-9b0c-45d3-8b98-d97b4065418c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7f4b9f7-b478-4e60-a533-67a7ab786f86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151802 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f4b9f7-b478-4e60-a533-67a7ab786f86-proxy-tls\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b74d087-5bad-4236-bad9-9808abed29e2-metrics-tls\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151909 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-config\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.151603 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f395140e-7a1c-45ce-8eab-9d11bf757838-plugins-dir\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.154784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f5d3875-9256-494d-a41d-5f4011c4462d-serving-cert\") 
pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.155309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.155664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61b3a667-550a-4955-91cc-8cd14d2a77c0-node-bootstrap-token\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.158261 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.163680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.164618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e088d04-9b0c-45d3-8b98-d97b4065418c-proxy-tls\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.164804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-metrics-certs\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.164955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.165005 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume\") pod \"collect-profiles-29416080-95j7p\" (UID: 
\"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.166319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f5d3875-9256-494d-a41d-5f4011c4462d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.169959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc8nb\" (UniqueName: \"kubernetes.io/projected/fa341b17-db5b-487c-9a0a-446a1842a78e-kube-api-access-pc8nb\") pod \"dns-operator-744455d44c-x67dc\" (UID: \"fa341b17-db5b-487c-9a0a-446a1842a78e\") " pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.170128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/24653880-7b0f-4174-ac74-5d13d99975e9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.171260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.172955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24653880-7b0f-4174-ac74-5d13d99975e9-images\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.192247 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.199906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.216700 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgqx\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.241552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk28r\" (UniqueName: \"kubernetes.io/projected/f395140e-7a1c-45ce-8eab-9d11bf757838-kube-api-access-kk28r\") pod \"csi-hostpathplugin-bvl5h\" (UID: \"f395140e-7a1c-45ce-8eab-9d11bf757838\") " pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.252348 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.252825 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.7528062 +0000 UTC m=+148.049621871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.258799 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5d3875-9256-494d-a41d-5f4011c4462d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv5vb\" (UID: \"3f5d3875-9256-494d-a41d-5f4011c4462d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.274254 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.276481 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwwzg\" (UniqueName: \"kubernetes.io/projected/12a1a6a0-cbae-49a3-8ca6-f015d67a70cd-kube-api-access-mwwzg\") pod \"kube-storage-version-migrator-operator-b67b599dd-94qs9\" (UID: \"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.292728 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh99j\" (UniqueName: \"kubernetes.io/projected/54e1caa8-222c-4e43-a2e5-c38cd995eaf1-kube-api-access-nh99j\") pod \"package-server-manager-789f6589d5-xwpfl\" (UID: \"54e1caa8-222c-4e43-a2e5-c38cd995eaf1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.308635 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.309084 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.313159 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.314048 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9nk\" (UniqueName: \"kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk\") pod \"marketplace-operator-79b997595-n7qfd\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.328446 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvfr\" (UniqueName: \"kubernetes.io/projected/5e088d04-9b0c-45d3-8b98-d97b4065418c-kube-api-access-hmvfr\") pod \"machine-config-controller-84d6567774-45rpq\" (UID: \"5e088d04-9b0c-45d3-8b98-d97b4065418c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.329000 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.352298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jpp\" (UniqueName: \"kubernetes.io/projected/e7f4b9f7-b478-4e60-a533-67a7ab786f86-kube-api-access-x8jpp\") pod \"machine-config-operator-74547568cd-kggxw\" (UID: \"e7f4b9f7-b478-4e60-a533-67a7ab786f86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.364277 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.364608 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.364970 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.864958762 +0000 UTC m=+148.161774423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.375560 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.380072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5hqk\" (UniqueName: \"kubernetes.io/projected/f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde-kube-api-access-j5hqk\") pod \"router-default-5444994796-89b4n\" (UID: \"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde\") " pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.394185 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjpnn\" (UniqueName: \"kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn\") pod \"collect-profiles-29416080-95j7p\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.408470 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.408640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c8d8e49-9ca8-425b-ac37-9409980c4ff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8qk5x\" (UID: \"0c8d8e49-9ca8-425b-ac37-9409980c4ff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.415802 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.432592 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.436625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtfh\" (UniqueName: \"kubernetes.io/projected/479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0-kube-api-access-bbtfh\") pod \"ingress-canary-8tph2\" (UID: \"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0\") " pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.451231 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcv5\" (UniqueName: \"kubernetes.io/projected/0b74d087-5bad-4236-bad9-9808abed29e2-kube-api-access-4wcv5\") pod \"dns-default-mfhrt\" (UID: \"0b74d087-5bad-4236-bad9-9808abed29e2\") " pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.459993 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.466684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.467441 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.467762 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.967712752 +0000 UTC m=+148.264528413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.468058 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.468440 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:02.968432346 +0000 UTC m=+148.265248007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.470184 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkxl\" (UniqueName: \"kubernetes.io/projected/b6be43bc-4f78-415c-abf8-c18e9bb5e21c-kube-api-access-blkxl\") pod \"service-ca-9c57cc56f-x24g6\" (UID: \"b6be43bc-4f78-415c-abf8-c18e9bb5e21c\") " pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.475729 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.485948 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.490931 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.494399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nndmw\" (UniqueName: \"kubernetes.io/projected/1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c-kube-api-access-nndmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-hfsls\" (UID: \"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.508403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8tph2" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.511293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxdwp\" (UniqueName: \"kubernetes.io/projected/3e2298f3-9cb9-4307-b3ab-1e9eb349788b-kube-api-access-fxdwp\") pod \"multus-admission-controller-857f4d67dd-c9jnh\" (UID: \"3e2298f3-9cb9-4307-b3ab-1e9eb349788b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.516345 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z2sb2"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.546925 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v8t\" (UniqueName: \"kubernetes.io/projected/5cf77011-0fae-4535-8564-b6393dfe49bb-kube-api-access-d5v8t\") pod \"service-ca-operator-777779d784-nznv9\" (UID: \"5cf77011-0fae-4535-8564-b6393dfe49bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.549865 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgg5n\" (UniqueName: \"kubernetes.io/projected/67f2c73f-2335-4f37-86ba-0dec25c93c9e-kube-api-access-sgg5n\") pod \"catalog-operator-68c6474976-mjg4d\" (UID: \"67f2c73f-2335-4f37-86ba-0dec25c93c9e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.570218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.570588 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.070573954 +0000 UTC m=+148.367389615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.588691 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8vv\" (UniqueName: \"kubernetes.io/projected/61b3a667-550a-4955-91cc-8cd14d2a77c0-kube-api-access-sm8vv\") pod \"machine-config-server-h6jst\" (UID: \"61b3a667-550a-4955-91cc-8cd14d2a77c0\") " pod="openshift-machine-config-operator/machine-config-server-h6jst" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.604751 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjnlf\" (UniqueName: \"kubernetes.io/projected/bf46954d-2487-49ae-92ab-38c47c77c9c2-kube-api-access-bjnlf\") pod \"packageserver-d55dfcdfc-pcn44\" (UID: \"bf46954d-2487-49ae-92ab-38c47c77c9c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.611355 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvlq\" (UniqueName: \"kubernetes.io/projected/24653880-7b0f-4174-ac74-5d13d99975e9-kube-api-access-whvlq\") pod \"machine-api-operator-5694c8668f-vs7jr\" (UID: \"24653880-7b0f-4174-ac74-5d13d99975e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.642559 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.650936 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.671086 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.671300 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.671680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.672010 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.17199856 +0000 UTC m=+148.468814221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.675902 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.685148 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.691914 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr"] Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.692324 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.698646 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.722425 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.739717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.772542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.772869 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.272849516 +0000 UTC m=+148.569665177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.802197 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.802197 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h6jst"
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.803871 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd"]
Dec 05 20:08:02 crc kubenswrapper[4885]: W1205 20:08:02.816382 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fe14c1_a7b0_4f3c_bbb6_9863c38e0bde.slice/crio-389c0815de9e4f5790dc4316816f3324ba665bfd6592dcb629549be547758692 WatchSource:0}: Error finding container 389c0815de9e4f5790dc4316816f3324ba665bfd6592dcb629549be547758692: Status 404 returned error can't find the container with id 389c0815de9e4f5790dc4316816f3324ba665bfd6592dcb629549be547758692
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.817254 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr"
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.884273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.884689 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.384673157 +0000 UTC m=+148.681488818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.905292 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s"]
Dec 05 20:08:02 crc kubenswrapper[4885]: I1205 20:08:02.984780 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:02 crc kubenswrapper[4885]: E1205 20:08:02.986750 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.486730024 +0000 UTC m=+148.783545695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.018988 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" event={"ID":"875142b2-83ed-4d64-88b5-a885640981d1","Type":"ContainerStarted","Data":"f65add80b31f33aab8fa6e23a8b830ea8e75b62acf0a54a2250fa6f4dc06847c"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.028993 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x67dc"]
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.034011 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jdrlk" podStartSLOduration=126.033992507 podStartE2EDuration="2m6.033992507s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:03.033096336 +0000 UTC m=+148.329911997" watchObservedRunningTime="2025-12-05 20:08:03.033992507 +0000 UTC m=+148.330808168"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.038950 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-89b4n" event={"ID":"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde","Type":"ContainerStarted","Data":"389c0815de9e4f5790dc4316816f3324ba665bfd6592dcb629549be547758692"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.056272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" event={"ID":"42769059-74c9-48f7-bdb2-7b97903610ba","Type":"ContainerStarted","Data":"6525a733708fdb615ba351337becb397678a056a9910dc5db26b86b63702b8eb"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.069488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" event={"ID":"d4e08709-09d1-497f-9d79-83b90f495bbf","Type":"ContainerStarted","Data":"143203c835851577823e07cda060cd82008fd286dced0a61d59c496041683bdb"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.089424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.091512 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.591476909 +0000 UTC m=+148.888292570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.101120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" event={"ID":"c9e46b72-f528-4f07-8b1e-96b98302ac86","Type":"ContainerStarted","Data":"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.101165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" event={"ID":"c9e46b72-f528-4f07-8b1e-96b98302ac86","Type":"ContainerStarted","Data":"56375deacdad25513422b9c51bab324fb3d7e97f288385e871f5a764e6acdc4e"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.108567 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.113883 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" event={"ID":"3a17b36b-0a7a-427b-9602-27aa06f15f73","Type":"ContainerStarted","Data":"a079bdd1c3d3ba3c17c4a8690ed84cc7cc752b34de3f54a482d988c9a4ede3fb"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.113932 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" event={"ID":"3a17b36b-0a7a-427b-9602-27aa06f15f73","Type":"ContainerStarted","Data":"0cf92f6857f1c4785c3625e0efc001deb6492ad038345c215c2a15a73190cb50"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.120503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.146814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" event={"ID":"f1540bd7-e50c-4f68-864c-a58e8c81bb03","Type":"ContainerStarted","Data":"69d360f7a3124713adb83bb76fe26db416fbd70efa18e9e9ff991f59d83971a2"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.147548 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"]
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.148316 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" event={"ID":"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550","Type":"ContainerStarted","Data":"f7a9156007e856532cc7e9ddd8e997d5e34468c24d414f58e49daa322c0e99f7"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.148358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" event={"ID":"ac8a91ac-c2d0-40b7-aa23-cf2a0081e550","Type":"ContainerStarted","Data":"3766fb50c2dbeeacf19262ab1ccc185f477eccecfeceaa6325a48b91c1c525f6"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.159985 4885 generic.go:334] "Generic (PLEG): container finished" podID="dc1ce980-9bdc-4b28-9f12-ab17b79b981c" containerID="60092eaeed1541a774c214703e6fd46add1cd6aa792c2e32bc556b98be657897" exitCode=0
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.160568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" event={"ID":"dc1ce980-9bdc-4b28-9f12-ab17b79b981c","Type":"ContainerDied","Data":"60092eaeed1541a774c214703e6fd46add1cd6aa792c2e32bc556b98be657897"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.160600 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" event={"ID":"dc1ce980-9bdc-4b28-9f12-ab17b79b981c","Type":"ContainerStarted","Data":"3381ad33e3a69cfbf48ac6fe9e96b75a99e5ff981660d14ce5b875d27f8c3599"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.161352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" event={"ID":"34f5add1-5763-4b13-8058-e1b6fbbb4740","Type":"ContainerStarted","Data":"d61f4330cc806d9407145dfa60d08efd1de135fea6f12049312ddcf3250e0675"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.162128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" event={"ID":"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd","Type":"ContainerStarted","Data":"eb3e881c406372d0b3b632feba2dc2c620169fb1197a741a927e02c0cb94b63c"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.165478 4885 generic.go:334] "Generic (PLEG): container finished" podID="6e273248-d3b5-4248-9e30-06ae7c6ab889" containerID="ce4efd6f0c8810f16ebd7755416bfa63b2ccc78d06434a3643643f559e97d30e" exitCode=0
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.165531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" event={"ID":"6e273248-d3b5-4248-9e30-06ae7c6ab889","Type":"ContainerDied","Data":"ce4efd6f0c8810f16ebd7755416bfa63b2ccc78d06434a3643643f559e97d30e"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.167756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" event={"ID":"6086abe8-3970-4d1c-9f3f-8075de87b8ec","Type":"ContainerStarted","Data":"3e058f208b5d8258dcf3f02074bb806b2065aa5c303f073e1d8c3daf0ec1f6f5"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.170995 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" event={"ID":"3f0c2333-b88a-430a-b7a7-0882ef369aab","Type":"ContainerStarted","Data":"e9daf7990c910750851e396426e5080845e09f725d093b180d4acf3b25fe08be"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.190872 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.191226 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.691209048 +0000 UTC m=+148.988024709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.198307 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-jdls9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.198351 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jdls9" podUID="98e1f477-999e-4584-a373-c07abd3a938c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.259720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8c72242aa54e9a89f4b4a702d7278c60867bd0525c4b80093ea2aafc9aa7d54b"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.259776 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.259789 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" event={"ID":"d3b6b866-d318-454f-9730-e56de394d130","Type":"ContainerStarted","Data":"90a0d0d385eaeebc3ee2bb6441c4624679e12c7bdddf69ffc7748700fced242f"}
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.294049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.302453 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.80243717 +0000 UTC m=+149.099252831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.386683 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jdls9" podStartSLOduration=126.386665453 podStartE2EDuration="2m6.386665453s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:03.277353445 +0000 UTC m=+148.574169136" watchObservedRunningTime="2025-12-05 20:08:03.386665453 +0000 UTC m=+148.683481114"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.394946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.395199 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.895185056 +0000 UTC m=+149.192000717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.493547 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-chk47" podStartSLOduration=126.493527019 podStartE2EDuration="2m6.493527019s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:03.492525175 +0000 UTC m=+148.789340836" watchObservedRunningTime="2025-12-05 20:08:03.493527019 +0000 UTC m=+148.790342680"
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.496614 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.496901 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:03.99688789 +0000 UTC m=+149.293703551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.598718 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.599096 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.099080341 +0000 UTC m=+149.395896002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
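The error that dominates this stretch, "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", means the kubelet's in-memory CSI plugin registry has no entry for that driver yet: the hostpath-provisioner node plugin has not registered over the kubelet's plugin-registration socket, so every MountDevice for the image-registry PVC and every TearDown for the old pod fails immediately. A registered driver is also reflected in the node's CSINode object, which can be inspected from outside. The sketch below is a diagnostic aid, not kubelet code; the kubeconfig location and the node name "crc" (taken from the log's hostname field) are assumptions.

```go
// csidrivers.go - list the CSI drivers registered on a node via its CSINode
// object. If kubevirt.io.hostpath-provisioner is absent here, the kubelet's
// "not found in the list of registered CSI drivers" errors are expected.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at the default ~/.kube/config location.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Assumption: node name "crc", matching this log's hostname.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered:", d.Name, "nodeID:", d.NodeID)
	}
}
```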
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.700234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.700780 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.200765106 +0000 UTC m=+149.497580767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.800842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.801267 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.301253269 +0000 UTC m=+149.598068930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:03 crc kubenswrapper[4885]: I1205 20:08:03.902296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:03 crc kubenswrapper[4885]: E1205 20:08:03.902652 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.402637843 +0000 UTC m=+149.699453504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.006046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.006394 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.506379765 +0000 UTC m=+149.803195426 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.108448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.108984 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.60897275 +0000 UTC m=+149.905788411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.210705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.210957 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.710939852 +0000 UTC m=+150.007755513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
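The mount/unmount pairs above repeat every few hundred milliseconds because each failure arms a per-operation retry gate in nestedpendingoperations.go: the operation records its last error time, no retry is permitted until that time plus durationBeforeRetry, and the reconciler's next pass simply finds the gate closed. These entries all show the initial 500ms window; in the upstream kubelet the window grows exponentially for an operation that keeps failing, up to a cap of roughly two minutes. The sketch below is a simplified model of that gate, not a copy of the kubelet's code; the constants mirror commonly cited upstream defaults and should be treated as illustrative.

```go
// backoff.go - a trimmed-down model of the per-operation exponential backoff
// behind "No retries permitted until ... (durationBeforeRetry 500ms)".
package main

import (
	"fmt"
	"time"
)

// Assumed values, matching upstream kubelet defaults as commonly documented.
const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second
)

type expBackoff struct {
	lastErrorTime       time.Time
	durationBeforeRetry time.Duration
}

// update records a failure and widens the retry window.
func (b *expBackoff) update(now time.Time) {
	if b.durationBeforeRetry == 0 {
		b.durationBeforeRetry = initialDurationBeforeRetry
	} else if b.durationBeforeRetry *= 2; b.durationBeforeRetry > maxDurationBeforeRetry {
		b.durationBeforeRetry = maxDurationBeforeRetry
	}
	b.lastErrorTime = now
}

// safeToRetry reports whether the gate has reopened.
func (b *expBackoff) safeToRetry(now time.Time) bool {
	return now.After(b.lastErrorTime.Add(b.durationBeforeRetry))
}

func main() {
	var b expBackoff
	now := time.Now()
	for i := 1; i <= 5; i++ {
		b.update(now)
		fmt.Printf("failure %d: no retries permitted for %v\n", i, b.durationBeforeRetry)
		now = now.Add(b.durationBeforeRetry) // simulate waiting out the window
	}
}
```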
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.216555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" event={"ID":"c6e3f1cc-5218-44b2-b4bf-168dae1629b7","Type":"ContainerStarted","Data":"173ae9461c0864f9e339bec53fa259f78761acf0c88a916751f2fbc8d628d20d"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.239654 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" event={"ID":"fa341b17-db5b-487c-9a0a-446a1842a78e","Type":"ContainerStarted","Data":"771c193d5b9274c78df698ee0d34be60a5b5155409ceb405daf1198255391183"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.269404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.299287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" event={"ID":"34f5add1-5763-4b13-8058-e1b6fbbb4740","Type":"ContainerStarted","Data":"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.300006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.315962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.317374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.318981 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.818959228 +0000 UTC m=+150.115774899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.320320 4885 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sp659 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.320413 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.320561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h6jst" event={"ID":"61b3a667-550a-4955-91cc-8cd14d2a77c0","Type":"ContainerStarted","Data":"a7809a23f1ae4ef9637c44cc55fc6db8c06fed1964875ee47dde1804abf38f93"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.342408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" event={"ID":"71071b19-6069-4981-afd0-e41274442bdb","Type":"ContainerStarted","Data":"1a8e30ceec8d7c9fbf3c3adbb55653c90d1718d8f0406c55e1ca30fb7e0be297"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.393068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" event={"ID":"2973ae71-2365-4bac-ab40-9aa54317c587","Type":"ContainerStarted","Data":"e629d2b96dca395b439c6c165aac648578192d1b741a021d964863b58daca26f"}
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.394121 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-jdls9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.394162 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jdls9" podUID="98e1f477-999e-4584-a373-c07abd3a938c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.428461 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bvl5h"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.433448 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.434445 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:04.934429019 +0000 UTC m=+150.231244680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.436265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8tph2"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.451859 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mhxlk" podStartSLOduration=127.451843479 podStartE2EDuration="2m7.451843479s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.450478284 +0000 UTC m=+149.747293935" watchObservedRunningTime="2025-12-05 20:08:04.451843479 +0000 UTC m=+149.748659140"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.499581 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.540435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.545946 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.045931451 +0000 UTC m=+150.342747112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
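The readiness failures in this stretch ("connect: connection refused" from 10.217.0.10:8443 and 10.217.0.7:8080) have the expected shape for containers that have started but are not yet listening: the prober dials the pod IP directly and keeps reporting failure until the socket opens. A minimal stand-in for the HTTP variant of that probe is sketched below; the URL is copied from the downloads-7954f5f757-jdls9 entries above, and the one-second timeout and status handling are simplifications of the kubelet's prober, not its actual code.

```go
// probe.go - a minimal HTTP readiness check in the spirit of the kubelet's
// prober: GET the endpoint, treat a 2xx/3xx response as ready, and surface
// dial errors such as "connect: connection refused" as probe failures.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. Get "http://10.217.0.7:8080/": dial tcp 10.217.0.7:8080:
		// connect: connection refused - the container isn't listening yet.
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("probe returned status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// URL taken from the log's download-server readiness probe.
	if err := probe("http://10.217.0.7:8080/"); err != nil {
		fmt.Println("Probe failed:", err)
		return
	}
	fmt.Println("ready")
}
```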
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.615443 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9f8n" podStartSLOduration=127.615427103 podStartE2EDuration="2m7.615427103s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.613424197 +0000 UTC m=+149.910239858" watchObservedRunningTime="2025-12-05 20:08:04.615427103 +0000 UTC m=+149.912242764"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.616336 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" podStartSLOduration=127.616329183 podStartE2EDuration="2m7.616329183s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.556922267 +0000 UTC m=+149.853737928" watchObservedRunningTime="2025-12-05 20:08:04.616329183 +0000 UTC m=+149.913144844"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.641159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:04 crc kubenswrapper[4885]: W1205 20:08:04.641335 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e1caa8_222c_4e43_a2e5_c38cd995eaf1.slice/crio-90b942ed450ce4e6810c2b83464e9f74997ea657c54533ee46b383c1afd5cfac WatchSource:0}: Error finding container 90b942ed450ce4e6810c2b83464e9f74997ea657c54533ee46b383c1afd5cfac: Status 404 returned error can't find the container with id 90b942ed450ce4e6810c2b83464e9f74997ea657c54533ee46b383c1afd5cfac
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.641561 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.141542702 +0000 UTC m=+150.438358363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.695963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5g9z9" podStartSLOduration=127.695943143 podStartE2EDuration="2m7.695943143s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.695487137 +0000 UTC m=+149.992302798" watchObservedRunningTime="2025-12-05 20:08:04.695943143 +0000 UTC m=+149.992758804"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.725559 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.744759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.745087 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.245075867 +0000 UTC m=+150.541891528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: W1205 20:08:04.773696 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5d3875_9256_494d_a41d_5f4011c4462d.slice/crio-18eb9a7b4ead3f265fc534388ed04923f7f201d5d67870c8d645b5e51fd8203c WatchSource:0}: Error finding container 18eb9a7b4ead3f265fc534388ed04923f7f201d5d67870c8d645b5e51fd8203c: Status 404 returned error can't find the container with id 18eb9a7b4ead3f265fc534388ed04923f7f201d5d67870c8d645b5e51fd8203c
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.783293 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfhrt"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.795700 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.810056 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j" podStartSLOduration=127.810042549 podStartE2EDuration="2m7.810042549s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.806604496 +0000 UTC m=+150.103420157" watchObservedRunningTime="2025-12-05 20:08:04.810042549 +0000 UTC m=+150.106858210"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.810829 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p"]
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.839915 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-95dk6" podStartSLOduration=127.83980067 podStartE2EDuration="2m7.83980067s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.830904994 +0000 UTC m=+150.127720655" watchObservedRunningTime="2025-12-05 20:08:04.83980067 +0000 UTC m=+150.136616341"
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.846493 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.846860 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.346840524 +0000 UTC m=+150.643656195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
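The pod_startup_latency_tracker lines report the same quantity twice: podStartE2EDuration is the human-readable form of podStartSLOduration. For these pods firstStartedPulling is the zero time, so no image-pull window is subtracted, and the numbers work out to watchObservedRunningTime minus podCreationTimestamp. The snippet below reproduces the arithmetic for the openshift-controller-manager-operator entry above; the timestamps are copied from the log, but note the formula is inferred from these entries rather than taken from the tracker's source.

```go
// sloduration.go - reproduce podStartSLOduration from the log's timestamps.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching Go's default time.Time formatting, which is what the
	// kubelet prints in these log fields.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-05 20:05:57 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 20:08:04.83980067 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(observed.Sub(created))           // 2m7.83980067s == podStartE2EDuration
	fmt.Println(observed.Sub(created).Seconds()) // 127.83980067  == podStartSLOduration
}
```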
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.953892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:04 crc kubenswrapper[4885]: E1205 20:08:04.954279 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.454265669 +0000 UTC m=+150.751081320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:04 crc kubenswrapper[4885]: I1205 20:08:04.957219 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" podStartSLOduration=127.957209247 podStartE2EDuration="2m7.957209247s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:04.944577236 +0000 UTC m=+150.241392897" watchObservedRunningTime="2025-12-05 20:08:04.957209247 +0000 UTC m=+150.254024908"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.007634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x24g6"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.030147 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6xd4d" podStartSLOduration=128.030125864 podStartE2EDuration="2m8.030125864s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:05.014136381 +0000 UTC m=+150.310952042" watchObservedRunningTime="2025-12-05 20:08:05.030125864 +0000 UTC m=+150.326941515"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.045835 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nznv9"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.061114 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.061397 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.561377594 +0000 UTC m=+150.858193265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.122471 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.162752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.163099 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.663087838 +0000 UTC m=+150.959903499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.220838 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c9jnh"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.220874 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.263528 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.264414 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.764376289 +0000 UTC m=+151.061191950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.285875 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.301295 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.364704 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.365238 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.865226545 +0000 UTC m=+151.162042206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
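The dense run of "SyncLoop UPDATE" source="api" entries here is the other input to the kubelet's sync loop: pod objects changing on the API server and arriving over the kubelet's watch. The same stream can be observed from outside the node with a filtered pod watch; in the sketch below, the kubeconfig path and the spec.nodeName=crc field selector are assumptions (the kubelet filters on its own node name internally).

```go
// podwatch.go - open the kind of API-server pod watch that produces the
// kubelet's "SyncLoop UPDATE" source="api" entries, filtered to one node.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Assumption: node name "crc", as in this log.
	w, err := cs.CoreV1().Pods(metav1.NamespaceAll).Watch(context.TODO(),
		metav1.ListOptions{FieldSelector: "spec.nodeName=crc"})
	if err != nil {
		panic(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		if pod, ok := ev.Object.(*corev1.Pod); ok {
			fmt.Printf("%s %s/%s\n", ev.Type, pod.Namespace, pod.Name)
		}
	}
}
```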
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.365238 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.865226545 +0000 UTC m=+151.162042206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.443610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" event={"ID":"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c","Type":"ContainerStarted","Data":"301d45c4a9b37648bace004a0a7d6c08b86e0dc1a4d777481148c474d588cd36"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.465590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.466353 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:05.966313589 +0000 UTC m=+151.263129250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.468544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" event={"ID":"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd","Type":"ContainerStarted","Data":"7a2323dd4456346ccaacbd3fbb8c9418170ef24ffbe6acac3e006c225dc6fbc4"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.468597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" event={"ID":"12a1a6a0-cbae-49a3-8ca6-f015d67a70cd","Type":"ContainerStarted","Data":"b11c529ed4b04a88a102bfcd8f3b84e4573d930f6c8f41eed556d705b9dfe8ea"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.480067 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" event={"ID":"6086abe8-3970-4d1c-9f3f-8075de87b8ec","Type":"ContainerStarted","Data":"2462db629e2faa28f150a0236d679981c034ff593ffb3025ad5588465e76a262"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.482200 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-z2sb2"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.485200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" event={"ID":"3e2298f3-9cb9-4307-b3ab-1e9eb349788b","Type":"ContainerStarted","Data":"793b3ebe2fdce47f58968510feb9d4c33ce1a8f450e4826bc12ba3a7fb89d180"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.495410 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" event={"ID":"6e273248-d3b5-4248-9e30-06ae7c6ab889","Type":"ContainerStarted","Data":"7b4ce702caa572910692ff68f3e4ff7508f6e8b1ece5468077af0b7ea53170ae"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.531256 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vs7jr"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.544712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" event={"ID":"f1540bd7-e50c-4f68-864c-a58e8c81bb03","Type":"ContainerStarted","Data":"726441ba22b4d7d48a8f9c01f13000734a6fe76441e829d12a0ffece4dccf411"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.544757 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" event={"ID":"f1540bd7-e50c-4f68-864c-a58e8c81bb03","Type":"ContainerStarted","Data":"8e526a42df3d633a9976bef1d4f22a51f600a5e82c69df4d0cb7b7a91db1853b"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.562317 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" event={"ID":"3f0c2333-b88a-430a-b7a7-0882ef369aab","Type":"ContainerStarted","Data":"83be34dba15a8f94e963c97797475da4293c679c505e08c5b204dae2a696eb93"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.563286 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.568087 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.568198 4885 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6r5qr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body=
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.568239 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" podUID="3f0c2333-b88a-430a-b7a7-0882ef369aab" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused"
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.570137 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.070121394 +0000 UTC m=+151.366937145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.571466 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x"]
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.588732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" event={"ID":"c6e3f1cc-5218-44b2-b4bf-168dae1629b7","Type":"ContainerStarted","Data":"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.589634 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.617903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" event={"ID":"2973ae71-2365-4bac-ab40-9aa54317c587","Type":"ContainerStarted","Data":"6289bbc53ecabfc242d585acead9c56c9f8d46fb20cd5ded5ee1d833f35b0914"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.617941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" event={"ID":"2973ae71-2365-4bac-ab40-9aa54317c587","Type":"ContainerStarted","Data":"affc322a2a5159ba72ede8dc9758fadea8254bf580d81430fc0abb13e5d430ea"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.620204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" event={"ID":"b6be43bc-4f78-415c-abf8-c18e9bb5e21c","Type":"ContainerStarted","Data":"76c3fdabe74098922dd9dc99e9d5e52e8045b44cf38b133a1be55edba1fd9b34"}
Dec 05 20:08:05 crc kubenswrapper[4885]: W1205 20:08:05.624549 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24653880_7b0f_4174_ac74_5d13d99975e9.slice/crio-d144130c6829ee0a1c452dcdaf652498342f4e0b181a857b945e715fa2425902 WatchSource:0}: Error finding container d144130c6829ee0a1c452dcdaf652498342f4e0b181a857b945e715fa2425902: Status 404 returned error can't find the container with id d144130c6829ee0a1c452dcdaf652498342f4e0b181a857b945e715fa2425902
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.628237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerStarted","Data":"ed6590a8bfa567fa8b50c10f01cc6923217488f41884aac615b8fbcd5d38a46f"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.645137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfhrt" event={"ID":"0b74d087-5bad-4236-bad9-9808abed29e2","Type":"ContainerStarted","Data":"edab6a32815a9cc185f632da5b09ef7772147074f771f09b18472a1abfdf9951"}
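The dense run of "SyncLoop (PLEG): event for pod" lines is the Pod Lifecycle Event Generator relisting containers from CRI-O and emitting one event per observed state change; the Data field carries the container or sandbox ID that changed. A simplified sketch of the event shape and how the sync loop consumes it (hypothetical types, not the kubelet's actual PLEG implementation):

```go
package main

import "fmt"

// podLifecycleEvent is an illustrative stand-in for a PLEG event:
// pod UID, event type, and the runtime ID that triggered it.
type podLifecycleEvent struct {
	ID   string // pod UID
	Type string // e.g. "ContainerStarted"
	Data string // container or sandbox ID reported by the runtime
}

func main() {
	events := make(chan podLifecycleEvent, 1)
	// Values copied from the migrator-59844c95c7-9n5ft entry above.
	events <- podLifecycleEvent{
		ID:   "f1540bd7-e50c-4f68-864c-a58e8c81bb03",
		Type: "ContainerStarted",
		Data: "726441ba22b4d7d48a8f9c01f13000734a6fe76441e829d12a0ffece4dccf411",
	}
	close(events)

	// The sync loop drains the channel and triggers a pod sync per event,
	// which is what each logged line corresponds to.
	for ev := range events {
		fmt.Printf("SyncLoop (PLEG): event for pod ID=%s Type=%s Data=%s\n", ev.ID, ev.Type, ev.Data)
	}
}
```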
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" event={"ID":"3f5d3875-9256-494d-a41d-5f4011c4462d","Type":"ContainerStarted","Data":"18eb9a7b4ead3f265fc534388ed04923f7f201d5d67870c8d645b5e51fd8203c"} Dec 05 20:08:05 crc kubenswrapper[4885]: W1205 20:08:05.651293 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8d8e49_9ca8_425b_ac37_9409980c4ff7.slice/crio-5f3154067dff0b0f05712f87b24d6e6d772b6fc22e14e98b3fb9861654225f22 WatchSource:0}: Error finding container 5f3154067dff0b0f05712f87b24d6e6d772b6fc22e14e98b3fb9861654225f22: Status 404 returned error can't find the container with id 5f3154067dff0b0f05712f87b24d6e6d772b6fc22e14e98b3fb9861654225f22 Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.652353 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-89b4n" event={"ID":"f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde","Type":"ContainerStarted","Data":"679d87627f9e921c59885e7ff54401666668a76fd4f2eb95f8ca40b0e785c5d5"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.668931 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.669813 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.16979871 +0000 UTC m=+151.466614371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.693360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" event={"ID":"fa341b17-db5b-487c-9a0a-446a1842a78e","Type":"ContainerStarted","Data":"ad102ad38fb50f5ab1422f5d94532adf13862b175be790361186d54d87776131"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.704713 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" event={"ID":"bf46954d-2487-49ae-92ab-38c47c77c9c2","Type":"ContainerStarted","Data":"735731cbe48a92833f949ef408d7e7859e78baa1767bb7fc33d9c44d2b646d4d"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.705582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" event={"ID":"f395140e-7a1c-45ce-8eab-9d11bf757838","Type":"ContainerStarted","Data":"98d09fb29ed329476925e55663aeb455131aa2d3f26c8e219fd6921493258d58"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.706327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" event={"ID":"e7f4b9f7-b478-4e60-a533-67a7ab786f86","Type":"ContainerStarted","Data":"4109333b48966137d236711b23afe5c9d37fb60d76c66cb53ac9278e9a05f95d"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.755344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" event={"ID":"54e1caa8-222c-4e43-a2e5-c38cd995eaf1","Type":"ContainerStarted","Data":"90b942ed450ce4e6810c2b83464e9f74997ea657c54533ee46b383c1afd5cfac"} Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.769644 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.769935 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.269921512 +0000 UTC m=+151.566737173 (durationBeforeRetry 500ms). 
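Each failed volume operation above is parked by nestedpendingoperations with a "No retries permitted until ..." deadline; every retry in this capture lands at the initial 500ms durationBeforeRetry. A sketch of that retry discipline follows; the doubling-and-cap policy in it is an assumption for illustration, since the captured log only ever shows the first backoff step:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff parks an operation until its backoff expires, mirroring
// the "No retries permitted until ... (durationBeforeRetry 500ms)" pattern.
func retryWithBackoff(op func() error, initial, max time.Duration, attempts int) error {
	delay := initial
	var err error
	for i := 0; i < attempts; i++ {
		if err = op(); err == nil {
			return nil
		}
		fmt.Printf("operation failed, no retries permitted for %v: %v\n", delay, err)
		time.Sleep(delay)
		// Assumed policy: double the delay up to a cap on repeated failure.
		if delay *= 2; delay > max {
			delay = max
		}
	}
	return err
}

func main() {
	calls := 0
	err := retryWithBackoff(func() error {
		calls++
		if calls < 4 {
			// Same error text the CSI attacher/unmounter returns above.
			return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
		}
		return nil // the driver eventually registers and the retry succeeds
	}, 500*time.Millisecond, 2*time.Minute, 10)
	fmt.Println("result:", err, "after", calls, "calls")
}
```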
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.769935 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.269921512 +0000 UTC m=+151.566737173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.802586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" event={"ID":"3e2c6d12-1e18-498c-82e4-9c778e7c4aea","Type":"ContainerStarted","Data":"c5a7bc59d4a487d07d1941d9077d09dbd7162fa2a7654271493cd94161b1e4b9"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.823931 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" event={"ID":"71071b19-6069-4981-afd0-e41274442bdb","Type":"ContainerStarted","Data":"74b90b9e0c0d6ab566356c1722d40470c9df09e88be62d7bc73fecaeb7d0bb37"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.823967 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" event={"ID":"71071b19-6069-4981-afd0-e41274442bdb","Type":"ContainerStarted","Data":"c6b2aaa318c0ffa8d6181851efb23396f147af181ffc3b1b79dcf93147851dd3"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.839109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" event={"ID":"67f2c73f-2335-4f37-86ba-0dec25c93c9e","Type":"ContainerStarted","Data":"fc898ce93763a9f596cba20839fba0c18dce09c520d85e183990464b6181842e"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.858651 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" event={"ID":"5e088d04-9b0c-45d3-8b98-d97b4065418c","Type":"ContainerStarted","Data":"046c38de2f8e55c4029af1e7ea43d47117656d1d850cec6eeed44f43f9ce25e7"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.858688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" event={"ID":"5e088d04-9b0c-45d3-8b98-d97b4065418c","Type":"ContainerStarted","Data":"7ab9d22ce2d4484be6b0512d99862148e3d933073829894f9979b523c9f6b890"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.869512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8tph2" event={"ID":"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0","Type":"ContainerStarted","Data":"f89f95c9cbb05c2b7252dc5e92e503fd70d4311efab9837a4019700572e535d2"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.869563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8tph2" event={"ID":"479aaab8-5f0c-4c81-9f9e-60b3ca4e2ec0","Type":"ContainerStarted","Data":"88d6fecf834305f1f4a2e12f4e572d0c1d10d78fe7c7417a69ea486f7a1c2ea6"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.871009 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.871337 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.371323057 +0000 UTC m=+151.668138718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.881472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" event={"ID":"5cf77011-0fae-4535-8564-b6393dfe49bb","Type":"ContainerStarted","Data":"1f1f221abd0bfab9f213569e579b8251d4d8e36e6e94cdf6cf3520861fe1d8bd"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.901491 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" event={"ID":"a5953ed8-08cc-443c-a9e0-be5f96f3d8dd","Type":"ContainerStarted","Data":"1dcf717eec18bbce9e6c9288d30d9c50a45f28ef70306cfa374a137aa37048f5"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.920791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" event={"ID":"dc1ce980-9bdc-4b28-9f12-ab17b79b981c","Type":"ContainerStarted","Data":"3b5371a3840b918cb25ec0d1d69b491a0f62cfc5f4b8dfe32056fab137abfd34"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.935331 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h6jst" event={"ID":"61b3a667-550a-4955-91cc-8cd14d2a77c0","Type":"ContainerStarted","Data":"b44d325b90e900016b6c64dfb3a48ca4572ab2f8e52f1e0934ced4a5c9815639"}
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.951447 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"
Dec 05 20:08:05 crc kubenswrapper[4885]: I1205 20:08:05.976307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:05 crc kubenswrapper[4885]: E1205 20:08:05.978809 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.478795024 +0000 UTC m=+151.775610685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.017996 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" podStartSLOduration=129.017978308 podStartE2EDuration="2m9.017978308s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.017537403 +0000 UTC m=+151.314353064" watchObservedRunningTime="2025-12-05 20:08:06.017978308 +0000 UTC m=+151.314793969"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.019749 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr" podStartSLOduration=129.019737296 podStartE2EDuration="2m9.019737296s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:05.979203317 +0000 UTC m=+151.276018978" watchObservedRunningTime="2025-12-05 20:08:06.019737296 +0000 UTC m=+151.316552957"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.052203 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" podStartSLOduration=129.052188176 podStartE2EDuration="2m9.052188176s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.051279776 +0000 UTC m=+151.348095447" watchObservedRunningTime="2025-12-05 20:08:06.052188176 +0000 UTC m=+151.349003837"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.077229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
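The pod_startup_latency_tracker lines report startup SLO duration as watchObservedRunningTime minus podCreationTimestamp; the ~2m9s values here reflect pods created at 20:05:57 that could only start once the node came up, and the zero-value pulling timestamps ("0001-01-01 ...") indicate no image pull contributed, which is why the SLO and E2E durations coincide. Reproducing the console-operator arithmetic from the entry above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the console-operator startup-latency entry.
	created, _ := time.Parse("2006-01-02 15:04:05 -0700 MST", "2025-12-05 20:05:57 +0000 UTC")
	watchObserved, _ := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", "2025-12-05 20:08:06.017978308 +0000 UTC")

	// With no pull window to subtract, SLO duration is simply the gap
	// between creation and the observed running time.
	slo := watchObserved.Sub(created)
	fmt.Printf("podStartSLOduration=%.9f podStartE2EDuration=%q\n", slo.Seconds(), slo.String())
	// Prints: podStartSLOduration=129.017978308 podStartE2EDuration="2m9.017978308s"
}
```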
Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.077722 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.577702934 +0000 UTC m=+151.874518595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.114996 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-h6jst" podStartSLOduration=7.114972075 podStartE2EDuration="7.114972075s" podCreationTimestamp="2025-12-05 20:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.109382459 +0000 UTC m=+151.406198130" watchObservedRunningTime="2025-12-05 20:08:06.114972075 +0000 UTC m=+151.411787736"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.175518 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z7z2s" podStartSLOduration=129.17550473 podStartE2EDuration="2m9.17550473s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.17341875 +0000 UTC m=+151.470234421" watchObservedRunningTime="2025-12-05 20:08:06.17550473 +0000 UTC m=+151.472320391"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.178638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.178913 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.678897732 +0000 UTC m=+151.975713393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.231607 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-94qs9" podStartSLOduration=129.231589366 podStartE2EDuration="2m9.231589366s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.213125441 +0000 UTC m=+151.509941102" watchObservedRunningTime="2025-12-05 20:08:06.231589366 +0000 UTC m=+151.528405027"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.279563 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.280001 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.779984866 +0000 UTC m=+152.076800527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.312866 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dcdhz" podStartSLOduration=129.31284566 podStartE2EDuration="2m9.31284566s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.283036478 +0000 UTC m=+151.579852139" watchObservedRunningTime="2025-12-05 20:08:06.31284566 +0000 UTC m=+151.609661321"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.313470 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" podStartSLOduration=129.31346413 podStartE2EDuration="2m9.31346413s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.311736993 +0000 UTC m=+151.608552654" watchObservedRunningTime="2025-12-05 20:08:06.31346413 +0000 UTC m=+151.610279801"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.380708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.381036 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:06.881006038 +0000 UTC m=+152.177821699 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.395989 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9n5ft" podStartSLOduration=129.395972986 podStartE2EDuration="2m9.395972986s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.394780166 +0000 UTC m=+151.691595827" watchObservedRunningTime="2025-12-05 20:08:06.395972986 +0000 UTC m=+151.692788647"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.444891 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fgdjd" podStartSLOduration=129.444873503 podStartE2EDuration="2m9.444873503s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.444275274 +0000 UTC m=+151.741090935" watchObservedRunningTime="2025-12-05 20:08:06.444873503 +0000 UTC m=+151.741689164"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.467609 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-89b4n"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.492551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.501159 4885 patch_prober.go:28] interesting pod/console-operator-58897d9998-z2sb2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.501244 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" podUID="6086abe8-3970-4d1c-9f3f-8075de87b8ec" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.507267 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:08:06 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld
Dec 05 20:08:06 crc kubenswrapper[4885]: [+]process-running ok
Dec 05 20:08:06 crc kubenswrapper[4885]: healthz check failed
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.507329 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.508692 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.008675157 +0000 UTC m=+152.305490818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.534895 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8tph2" podStartSLOduration=7.534877959 podStartE2EDuration="7.534877959s" podCreationTimestamp="2025-12-05 20:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.534344751 +0000 UTC m=+151.831160412" watchObservedRunningTime="2025-12-05 20:08:06.534877959 +0000 UTC m=+151.831693620"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.537995 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-89b4n" podStartSLOduration=129.537986772 podStartE2EDuration="2m9.537986772s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:06.507316452 +0000 UTC m=+151.804132113" watchObservedRunningTime="2025-12-05 20:08:06.537986772 +0000 UTC m=+151.834802433"
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.593392 4885 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qcd9b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.593440 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
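The router's startup probe fails with a 500 whose body is the aggregated healthz format: one "[-]name failed: reason withheld" or "[+]name ok" line per named check, followed by "healthz check failed"; journald renders that multi-line body as separate kubenswrapper lines, which is what the has-synced/process-running fragments above are. A plain net/http sketch that renders the same body shape (illustrative only, not the actual openshift-router handler, and check order here follows Go's randomized map iteration):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// healthz aggregates named checks into the "[-]/[+]" body seen in the log.
func healthz(checks map[string]func() error) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		failed := false
		body := ""
		for name, check := range checks {
			if err := check(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // the probe logs "statuscode: 500"
			body += "healthz check failed\n"
		}
		io.WriteString(w, body)
	}
}

func main() {
	h := healthz(map[string]func() error{
		"backend-http":    func() error { return fmt.Errorf("not ready") },
		"has-synced":      func() error { return fmt.Errorf("not ready") },
		"process-running": func() error { return nil },
	})
	srv := httptest.NewServer(h)
	defer srv.Close()
	resp, _ := http.Get(srv.URL + "/healthz")
	b, _ := io.ReadAll(resp.Body)
	fmt.Printf("status=%d\n%s", resp.StatusCode, b)
}
```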
\"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.614074 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.114061944 +0000 UTC m=+152.410877605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.632189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.632612 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.708307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v" Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.714787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.715353 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.215323434 +0000 UTC m=+152.512139095 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.816077 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.816441 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 20:08:07.316428609 +0000 UTC m=+152.613244270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.918031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.918159 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.418133152 +0000 UTC m=+152.714948803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.918462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:06 crc kubenswrapper[4885]: E1205 20:08:06.918795 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.418782884 +0000 UTC m=+152.715598545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.977756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" event={"ID":"24653880-7b0f-4174-ac74-5d13d99975e9","Type":"ContainerStarted","Data":"2fe0cb2d57712764833727f364b73e4ad5357777685d2f90117d49f59a71dedf"} Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.977808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" event={"ID":"24653880-7b0f-4174-ac74-5d13d99975e9","Type":"ContainerStarted","Data":"d144130c6829ee0a1c452dcdaf652498342f4e0b181a857b945e715fa2425902"} Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.981588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" event={"ID":"5cf77011-0fae-4535-8564-b6393dfe49bb","Type":"ContainerStarted","Data":"90a6c35f318e28f6f3878917ec3b721ad720fe91575148c3e01a29199ce969ab"} Dec 05 20:08:06 crc kubenswrapper[4885]: I1205 20:08:06.998595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" event={"ID":"1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c","Type":"ContainerStarted","Data":"f5a2304ee767615fac875045afc2b4b79d8214dfdcc99adb12d80c0a702f3fc5"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.003997 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" event={"ID":"bf46954d-2487-49ae-92ab-38c47c77c9c2","Type":"ContainerStarted","Data":"ab2da5e5cb6ab14fcd7fdfc9a7cfb9b3b55afede0be3f881e9f513203213e743"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.004860 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.006461 4885 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pcn44 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.006518 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" podUID="bf46954d-2487-49ae-92ab-38c47c77c9c2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.011467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" event={"ID":"3e2c6d12-1e18-498c-82e4-9c778e7c4aea","Type":"ContainerStarted","Data":"235e6c66258d2b260840a0b140f97d224d738da39eaf02e97a84ddab1029330f"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.016328 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" event={"ID":"54e1caa8-222c-4e43-a2e5-c38cd995eaf1","Type":"ContainerStarted","Data":"d7de3f0f2822fa98294f5200fb3e9ddbd96dd4533ceb91cc3bbf8f15a6c4f1bd"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.016372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" event={"ID":"54e1caa8-222c-4e43-a2e5-c38cd995eaf1","Type":"ContainerStarted","Data":"61249de03fa3c671d86686161ee9a7a5385554888616a9b60b8c87a4001f60ea"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.016966 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.019291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.019681 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.519653051 +0000 UTC m=+152.816468712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.021127 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nznv9" podStartSLOduration=130.021113149 podStartE2EDuration="2m10.021113149s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.017454048 +0000 UTC m=+152.314269719" watchObservedRunningTime="2025-12-05 20:08:07.021113149 +0000 UTC m=+152.317928810" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.032615 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" event={"ID":"67f2c73f-2335-4f37-86ba-0dec25c93c9e","Type":"ContainerStarted","Data":"3140acac9578b548084ce025a77db24227655b97e50bb491612ef6f15bdb7e89"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.034094 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.037209 4885 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mjg4d container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 
10.217.0.29:8443: connect: connection refused" start-of-body= Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.037248 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" podUID="67f2c73f-2335-4f37-86ba-0dec25c93c9e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.048640 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfhrt" event={"ID":"0b74d087-5bad-4236-bad9-9808abed29e2","Type":"ContainerStarted","Data":"724b5c56642e58fd86362ffb56fdfb3c798f5ae8851c8b77a1134ec911884968"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.058741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" event={"ID":"3f5d3875-9256-494d-a41d-5f4011c4462d","Type":"ContainerStarted","Data":"2f7735606c94403f64814b18d3708f8292dc328481d42939e75b3f475d402c81"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.083388 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" podStartSLOduration=130.083368721 podStartE2EDuration="2m10.083368721s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.058121541 +0000 UTC m=+152.354937212" watchObservedRunningTime="2025-12-05 20:08:07.083368721 +0000 UTC m=+152.380184382" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.087621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" event={"ID":"6e273248-d3b5-4248-9e30-06ae7c6ab889","Type":"ContainerStarted","Data":"2a6d591388a761c7091a66d43470124836dacce4136b55b2b61491284a4c078d"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.104205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" event={"ID":"b6be43bc-4f78-415c-abf8-c18e9bb5e21c","Type":"ContainerStarted","Data":"0f0e3030528bc69fc01705653a8198041e0f74eafca5f0dcbce08dd87bed05e1"} Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.104998 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" podStartSLOduration=131.104979361 podStartE2EDuration="2m11.104979361s" podCreationTimestamp="2025-12-05 20:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.083083402 +0000 UTC m=+152.379899073" watchObservedRunningTime="2025-12-05 20:08:07.104979361 +0000 UTC m=+152.401795022" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.107112 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" podStartSLOduration=130.107089081 podStartE2EDuration="2m10.107089081s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.104918129 +0000 UTC m=+152.401733790" watchObservedRunningTime="2025-12-05 
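The "connection refused" and Client.Timeout readiness failures clustered through this window are the normal startup pattern rather than a fault: the kubelet begins probing as soon as PLEG reports ContainerStarted, typically before the process has bound its port, and the same pods flip to status="ready" a probe period later (see the "SyncLoop (probe)" ready lines that follow). A minimal sketch of such an HTTP readiness probe with a short timeout; the endpoint, timeout, and TLS handling are illustrative, not the kubelet prober's exact configuration:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeReadiness issues an HTTPS GET the way a readiness probe would,
// with a short client timeout and no cert verification (self-signed
// serving certs are common on these operator endpoints).
func probeReadiness(url string) (ok bool, output string) {
	client := &http.Client{
		Timeout:   1 * time.Second,
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 10.217.0.29:8443: connect: connection refused"
		// while the container is still binding its port.
		return false, err.Error()
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400, resp.Status
}

func main() {
	ok, out := probeReadiness("https://10.217.0.29:8443/healthz")
	fmt.Printf("probeResult=%v output=%q\n", ok, out)
}
```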
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.107112 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hfsls" podStartSLOduration=130.107089081 podStartE2EDuration="2m10.107089081s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.104918129 +0000 UTC m=+152.401733790" watchObservedRunningTime="2025-12-05 20:08:07.107089081 +0000 UTC m=+152.403904742"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.123369 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.124989 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.624975766 +0000 UTC m=+152.921791427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.133086 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" podStartSLOduration=130.133066586 podStartE2EDuration="2m10.133066586s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.132844128 +0000 UTC m=+152.429659789" watchObservedRunningTime="2025-12-05 20:08:07.133066586 +0000 UTC m=+152.429882247"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.134770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" event={"ID":"5e088d04-9b0c-45d3-8b98-d97b4065418c","Type":"ContainerStarted","Data":"f1bf9f71d18df74409098da54e5e41b0c0da55df8fa33f454c47c5574c331994"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.195096 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" podStartSLOduration=130.195079669 podStartE2EDuration="2m10.195079669s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.169093275 +0000 UTC m=+152.465908936" watchObservedRunningTime="2025-12-05 20:08:07.195079669 +0000 UTC m=+152.491895330"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.195743 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-x24g6" podStartSLOduration=130.195738651 podStartE2EDuration="2m10.195738651s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.194180069 +0000 UTC m=+152.490995730" watchObservedRunningTime="2025-12-05 20:08:07.195738651 +0000 UTC m=+152.492554322"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.199464 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8z75j"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.199511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" event={"ID":"e7f4b9f7-b478-4e60-a533-67a7ab786f86","Type":"ContainerStarted","Data":"52135cf810cac1e99b25a043e8f552f3d62d4ef53fb111fff9892adba18d18fb"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.199532 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" event={"ID":"e7f4b9f7-b478-4e60-a533-67a7ab786f86","Type":"ContainerStarted","Data":"98589d417788432600d0819e1b57feeb0344d16f42a74fa54ea562e284e486ec"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.208244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" event={"ID":"0c8d8e49-9ca8-425b-ac37-9409980c4ff7","Type":"ContainerStarted","Data":"cd8c071d45b57cabdde59aa84255a28ad0c7fd77b63bd2ff7df6a081b38af821"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.208305 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" event={"ID":"0c8d8e49-9ca8-425b-ac37-9409980c4ff7","Type":"ContainerStarted","Data":"5f3154067dff0b0f05712f87b24d6e6d772b6fc22e14e98b3fb9861654225f22"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.223808 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" podStartSLOduration=130.223793664 podStartE2EDuration="2m10.223793664s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.222652487 +0000 UTC m=+152.519468148" watchObservedRunningTime="2025-12-05 20:08:07.223793664 +0000 UTC m=+152.520609325"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.226501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.227528 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.727507478 +0000 UTC m=+153.024323139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.231275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" event={"ID":"3e2298f3-9cb9-4307-b3ab-1e9eb349788b","Type":"ContainerStarted","Data":"4090faa7dcc81d01c1a13d7b8771446d93a7e7bc43d911f602f5e258538666cc"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.251919 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv5vb" podStartSLOduration=130.25190257 podStartE2EDuration="2m10.25190257s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.250626988 +0000 UTC m=+152.547442649" watchObservedRunningTime="2025-12-05 20:08:07.25190257 +0000 UTC m=+152.548718231"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.252519 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" event={"ID":"f395140e-7a1c-45ce-8eab-9d11bf757838","Type":"ContainerStarted","Data":"72b9d94874b5e964074dbf2ba66c0aabaa4dbc7ba0f7c6893269582db52d9fb1"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.269673 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerStarted","Data":"8e1392383c19bfc5439cf6a03b16f4e7128a7e48f79ec146434f29359e401e0f"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.270639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.271466 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n7qfd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.271523 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.294643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" event={"ID":"fa341b17-db5b-487c-9a0a-446a1842a78e","Type":"ContainerStarted","Data":"676bc06e8c32cc67241cbd23513762260ea0fd315a76cf6a25b5402a45d79b23"}
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.310397 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.312843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6r5qr"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.313253 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7w97v"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.328824 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kggxw" podStartSLOduration=130.328802539 podStartE2EDuration="2m10.328802539s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.326717779 +0000 UTC m=+152.623533450" watchObservedRunningTime="2025-12-05 20:08:07.328802539 +0000 UTC m=+152.625618200"
Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.330533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth"
Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.341621 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.841600165 +0000 UTC m=+153.138415826 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.348292 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-z2sb2" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.370314 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-45rpq" podStartSLOduration=130.37029057 podStartE2EDuration="2m10.37029057s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.366448302 +0000 UTC m=+152.663263993" watchObservedRunningTime="2025-12-05 20:08:07.37029057 +0000 UTC m=+152.667106231" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.429178 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8qk5x" podStartSLOduration=130.429159949 podStartE2EDuration="2m10.429159949s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.406383891 +0000 UTC m=+152.703199552" watchObservedRunningTime="2025-12-05 20:08:07.429159949 +0000 UTC m=+152.725975610" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.431587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.431863 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:07.931847778 +0000 UTC m=+153.228663439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.464364 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" podStartSLOduration=130.4643478 podStartE2EDuration="2m10.4643478s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.428072862 +0000 UTC m=+152.724888523" watchObservedRunningTime="2025-12-05 20:08:07.4643478 +0000 UTC m=+152.761163461" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.475118 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:08:07 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Dec 05 20:08:07 crc kubenswrapper[4885]: [+]process-running ok Dec 05 20:08:07 crc kubenswrapper[4885]: healthz check failed Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.475180 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.526148 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-x67dc" podStartSLOduration=130.526124315 podStartE2EDuration="2m10.526124315s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:07.524571114 +0000 UTC m=+152.821386775" watchObservedRunningTime="2025-12-05 20:08:07.526124315 +0000 UTC m=+152.822939976" Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.533869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.534238 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.034226935 +0000 UTC m=+153.331042596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.635530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.635831 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.135815736 +0000 UTC m=+153.432631397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.736444 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.736772 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.236760955 +0000 UTC m=+153.533576616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.837976 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.838331 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.338313515 +0000 UTC m=+153.635129176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:07 crc kubenswrapper[4885]: I1205 20:08:07.939517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:07 crc kubenswrapper[4885]: E1205 20:08:07.939973 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.439957697 +0000 UTC m=+153.736773358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.040376 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.040553 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.540528913 +0000 UTC m=+153.837344574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.040659 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.041009 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.541001679 +0000 UTC m=+153.837817340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.141760 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.142056 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.642040322 +0000 UTC m=+153.938855983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.243191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.243568 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.74355215 +0000 UTC m=+154.040367811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.301380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" event={"ID":"f395140e-7a1c-45ce-8eab-9d11bf757838","Type":"ContainerStarted","Data":"fe7261b925b625291080ef0bd2d687a711556a827551e1fd2f9efffd32aa74a3"} Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.302830 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" event={"ID":"24653880-7b0f-4174-ac74-5d13d99975e9","Type":"ContainerStarted","Data":"fa87806d9d8604821f3fb7c9ccbc23f7aff04b1299c89976282809169e4325c3"} Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.306565 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfhrt" event={"ID":"0b74d087-5bad-4236-bad9-9808abed29e2","Type":"ContainerStarted","Data":"7ce9176bf861cf1924a9d076a135cecc68f536f7ba438e36e1435ce2d6aa0cb4"} Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.306989 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.313184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" event={"ID":"3e2298f3-9cb9-4307-b3ab-1e9eb349788b","Type":"ContainerStarted","Data":"f22ae63a261836d46c6f24d8577f8f41eeb1749aca35943c0d508e12a116f47c"} Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.313878 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n7qfd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.313905 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.326169 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vs7jr" podStartSLOduration=131.326149948 podStartE2EDuration="2m11.326149948s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:08.325135725 +0000 UTC m=+153.621951386" watchObservedRunningTime="2025-12-05 20:08:08.326149948 +0000 UTC m=+153.622965599" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.332189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjg4d" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.343626 
4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.349141 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.849125693 +0000 UTC m=+154.145941354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.379740 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mfhrt" podStartSLOduration=9.379722131 podStartE2EDuration="9.379722131s" podCreationTimestamp="2025-12-05 20:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:08.373945999 +0000 UTC m=+153.670761650" watchObservedRunningTime="2025-12-05 20:08:08.379722131 +0000 UTC m=+153.676537792" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.392621 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c9jnh" podStartSLOduration=131.39259832 podStartE2EDuration="2m11.39259832s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:08.348568534 +0000 UTC m=+153.645384195" watchObservedRunningTime="2025-12-05 20:08:08.39259832 +0000 UTC m=+153.689413981" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.394192 4885 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.445698 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.445990 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:08.945979987 +0000 UTC m=+154.242795648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.470495 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:08:08 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Dec 05 20:08:08 crc kubenswrapper[4885]: [+]process-running ok Dec 05 20:08:08 crc kubenswrapper[4885]: healthz check failed Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.470550 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.548564 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.548885 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.048869901 +0000 UTC m=+154.345685562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.643559 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pcn44" Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.650437 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.650759 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.15074645 +0000 UTC m=+154.447562111 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.751263 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.751571 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.251545615 +0000 UTC m=+154.548361276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.751796 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.752229 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.252215087 +0000 UTC m=+154.549030748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.852446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.852701 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.3526769 +0000 UTC m=+154.649492561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.853329 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.853701 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.353686294 +0000 UTC m=+154.650501955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.955004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.955224 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.455196923 +0000 UTC m=+154.752012584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:08 crc kubenswrapper[4885]: I1205 20:08:08.955449 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:08 crc kubenswrapper[4885]: E1205 20:08:08.955829 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.455817352 +0000 UTC m=+154.752633013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.056502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:09 crc kubenswrapper[4885]: E1205 20:08:09.056745 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.556727971 +0000 UTC m=+154.853543632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.057079 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: E1205 20:08:09.057380 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.557368752 +0000 UTC m=+154.854184413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.088283 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-24244"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.089535 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.091495 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.116740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24244"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.157917 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.158297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.158339 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tq47\" (UniqueName: \"kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.158363 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: E1205 20:08:09.158480 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.658462566 +0000 UTC m=+154.955278227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.259392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tq47\" (UniqueName: \"kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.259441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.259543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.259576 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.260054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.260611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: E1205 20:08:09.260892 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:08:09.760876234 +0000 UTC m=+155.057691905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bctth" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.278124 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tq47\" (UniqueName: \"kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47\") pod \"community-operators-24244\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") " pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.283263 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-28gdb"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.284216 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.285510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.293692 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28gdb"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.321894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" event={"ID":"f395140e-7a1c-45ce-8eab-9d11bf757838","Type":"ContainerStarted","Data":"a3cab85c6ef20a939181fa0105fdba627b755097e958983a5f47f6f13eb8d339"} Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.321942 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" event={"ID":"f395140e-7a1c-45ce-8eab-9d11bf757838","Type":"ContainerStarted","Data":"53bc35daa1cd65b38e1a30e8fe31ae44066b91ec14ee005a62a94fe03610d42d"} Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.325427 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.341746 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bvl5h" podStartSLOduration=10.341728246 podStartE2EDuration="10.341728246s" podCreationTimestamp="2025-12-05 20:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:09.340793144 +0000 UTC m=+154.637608805" watchObservedRunningTime="2025-12-05 20:08:09.341728246 +0000 UTC m=+154.638543907" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.353438 4885 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T20:08:08.394221214Z","Handler":null,"Name":""} Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.355599 4885 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock 
versions: 1.0.0 Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.355639 4885 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.360544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.360730 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.360912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qlr4\" (UniqueName: \"kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.361080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.395867 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.403007 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24244" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462487 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qlr4\" (UniqueName: \"kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462545 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.462931 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.464636 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.464665 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.482900 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:08:09 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Dec 05 20:08:09 crc kubenswrapper[4885]: [+]process-running ok Dec 05 20:08:09 crc kubenswrapper[4885]: healthz check failed Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.482958 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.491003 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.492327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.492940 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qlr4\" (UniqueName: \"kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4\") pod \"certified-operators-28gdb\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") " pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.507971 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.515041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bctth\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.563406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.563466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h987n\" (UniqueName: \"kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n\") pod 
\"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.563498 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.608884 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.610509 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24244"] Dec 05 20:08:09 crc kubenswrapper[4885]: W1205 20:08:09.623987 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746679e1_b958_4320_bc6c_00060a83db3f.slice/crio-f89da8872abaa119890bed0b796f6f54beaa63515ff2798b8bf2cb8b22d8ee9c WatchSource:0}: Error finding container f89da8872abaa119890bed0b796f6f54beaa63515ff2798b8bf2cb8b22d8ee9c: Status 404 returned error can't find the container with id f89da8872abaa119890bed0b796f6f54beaa63515ff2798b8bf2cb8b22d8ee9c Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.664534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h987n\" (UniqueName: \"kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.664595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.665131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.665532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.665825 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.692285 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h987n\" (UniqueName: \"kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n\") pod \"community-operators-76bdk\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.698916 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.700789 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.700906 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.748615 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.768653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.768719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4nl4\" (UniqueName: \"kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.768752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.805443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28gdb"] Dec 05 20:08:09 crc kubenswrapper[4885]: W1205 20:08:09.837193 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f67e39_acf3_4ec4_af3f_68159973345e.slice/crio-7d8ca4c12a9bf5340671faec319181cc1e7b0983146e63a473a9acf4d7700985 WatchSource:0}: Error finding container 7d8ca4c12a9bf5340671faec319181cc1e7b0983146e63a473a9acf4d7700985: Status 404 returned error can't find the container with id 7d8ca4c12a9bf5340671faec319181cc1e7b0983146e63a473a9acf4d7700985 Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.841459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.869439 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.869510 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4nl4\" (UniqueName: \"kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.869539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.870374 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.870759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.890214 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4nl4\" (UniqueName: \"kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4\") pod \"certified-operators-bl5hc\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:09 crc kubenswrapper[4885]: I1205 20:08:09.974664 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"] Dec 05 20:08:10 crc kubenswrapper[4885]: W1205 20:08:10.011574 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98724fc_908e_4a61_bb2b_905c0f5709a5.slice/crio-7231f8b1bfa265d89f54bb8cda89353e2d9580675a3afcb7ad0d7bee453a1663 WatchSource:0}: Error finding container 7231f8b1bfa265d89f54bb8cda89353e2d9580675a3afcb7ad0d7bee453a1663: Status 404 returned error can't find the container with id 7231f8b1bfa265d89f54bb8cda89353e2d9580675a3afcb7ad0d7bee453a1663 Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.035886 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.044193 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:08:10 crc kubenswrapper[4885]: W1205 20:08:10.045831 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba098ab6_d9df_4d50_aaa6_085658e80871.slice/crio-e32a3e71bd1c4103f742a621b7874f7a71266de42f95797ed45dccbc197c6c21 WatchSource:0}: Error finding container e32a3e71bd1c4103f742a621b7874f7a71266de42f95797ed45dccbc197c6c21: Status 404 returned error can't find the container with id e32a3e71bd1c4103f742a621b7874f7a71266de42f95797ed45dccbc197c6c21 Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.246854 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:08:10 crc kubenswrapper[4885]: W1205 20:08:10.254606 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bca36c4_e503_4b3f_aaeb_829cebc24e4c.slice/crio-60b51facfd94710ee72d5fa43ac7d310fc0be42eddd26c01b0449b49ac982e8f WatchSource:0}: Error finding container 60b51facfd94710ee72d5fa43ac7d310fc0be42eddd26c01b0449b49ac982e8f: Status 404 returned error can't find the container with id 60b51facfd94710ee72d5fa43ac7d310fc0be42eddd26c01b0449b49ac982e8f Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.328647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerStarted","Data":"60b51facfd94710ee72d5fa43ac7d310fc0be42eddd26c01b0449b49ac982e8f"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.331389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" event={"ID":"c98724fc-908e-4a61-bb2b-905c0f5709a5","Type":"ContainerStarted","Data":"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.331446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" event={"ID":"c98724fc-908e-4a61-bb2b-905c0f5709a5","Type":"ContainerStarted","Data":"7231f8b1bfa265d89f54bb8cda89353e2d9580675a3afcb7ad0d7bee453a1663"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.331474 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.333354 4885 generic.go:334] "Generic (PLEG): container finished" podID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerID="7839561cfb0c4f78ca86ff632075444f2afebc20c60f20aeb845dd09bb97f091" exitCode=0 Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.333403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerDied","Data":"7839561cfb0c4f78ca86ff632075444f2afebc20c60f20aeb845dd09bb97f091"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.333421 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerStarted","Data":"e32a3e71bd1c4103f742a621b7874f7a71266de42f95797ed45dccbc197c6c21"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.335250 4885 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.335883 4885 generic.go:334] "Generic (PLEG): container finished" podID="61f67e39-acf3-4ec4-af3f-68159973345e" containerID="9b484f4d1df1bfad4ba36b2ca3192b70c35da836f76d7ca70153a71bfb2d75b7" exitCode=0 Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.335928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerDied","Data":"9b484f4d1df1bfad4ba36b2ca3192b70c35da836f76d7ca70153a71bfb2d75b7"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.335947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerStarted","Data":"7d8ca4c12a9bf5340671faec319181cc1e7b0983146e63a473a9acf4d7700985"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.338545 4885 generic.go:334] "Generic (PLEG): container finished" podID="746679e1-b958-4320-bc6c-00060a83db3f" containerID="dc3c2c43f22e9bf622a4c01c331da8efb70e48d3624e9e9f042add3a80fd3256" exitCode=0 Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.339661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerDied","Data":"dc3c2c43f22e9bf622a4c01c331da8efb70e48d3624e9e9f042add3a80fd3256"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.339681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerStarted","Data":"f89da8872abaa119890bed0b796f6f54beaa63515ff2798b8bf2cb8b22d8ee9c"} Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.356231 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" podStartSLOduration=133.356153913 podStartE2EDuration="2m13.356153913s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:10.35305181 +0000 UTC m=+155.649867491" watchObservedRunningTime="2025-12-05 20:08:10.356153913 +0000 UTC m=+155.652969594" Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.471733 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:08:10 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Dec 05 20:08:10 crc kubenswrapper[4885]: [+]process-running ok Dec 05 20:08:10 crc kubenswrapper[4885]: healthz check failed Dec 05 20:08:10 crc kubenswrapper[4885]: I1205 20:08:10.471822 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.089378 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"] Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.090737 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.098265 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.106442 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"] Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.124750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jdls9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.193062 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.193108 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.193403 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4c8l\" (UniqueName: \"kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.197356 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.260222 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.260276 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.265797 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.293014 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.293557 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.294896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.294973 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.295057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4c8l\" (UniqueName: \"kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.295551 4885 patch_prober.go:28] interesting pod/console-f9d7485db-jdrlk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.295632 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jdrlk" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.296958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.298319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.316049 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4c8l\" (UniqueName: \"kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l\") pod \"redhat-marketplace-g2dmb\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") " pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.348807 4885 generic.go:334] "Generic (PLEG): container finished" podID="3e2c6d12-1e18-498c-82e4-9c778e7c4aea" containerID="235e6c66258d2b260840a0b140f97d224d738da39eaf02e97a84ddab1029330f" exitCode=0 Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.348884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" event={"ID":"3e2c6d12-1e18-498c-82e4-9c778e7c4aea","Type":"ContainerDied","Data":"235e6c66258d2b260840a0b140f97d224d738da39eaf02e97a84ddab1029330f"} Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.353170 4885 generic.go:334] "Generic (PLEG): container finished" podID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerID="5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a" exitCode=0 Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.353219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerDied","Data":"5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a"} Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.357544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wdpql" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.427309 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.479729 4885 patch_prober.go:28] interesting pod/router-default-5444994796-89b4n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:08:11 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Dec 05 20:08:11 crc kubenswrapper[4885]: [+]process-running ok Dec 05 20:08:11 crc kubenswrapper[4885]: healthz check failed Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.479785 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-89b4n" podUID="f5fe14c1-a7b0-4f3c-bbb6-9863c38e0bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.518413 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.519512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.535377 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.707337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6724d\" (UniqueName: \"kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.707871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.707969 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.747791 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"] Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.809440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6724d\" (UniqueName: 
\"kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.809493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.809559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.809936 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.810488 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.849758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6724d\" (UniqueName: \"kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d\") pod \"redhat-marketplace-pkhq9\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:11 crc kubenswrapper[4885]: I1205 20:08:11.874519 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.152334 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:08:12 crc kubenswrapper[4885]: W1205 20:08:12.187974 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42603535_a30f_41b2_96e3_10f3f8144003.slice/crio-3e986a6fcb277c5260db378a5592b785e1d26f12b1936a4ab5613586e3701aa8 WatchSource:0}: Error finding container 3e986a6fcb277c5260db378a5592b785e1d26f12b1936a4ab5613586e3701aa8: Status 404 returned error can't find the container with id 3e986a6fcb277c5260db378a5592b785e1d26f12b1936a4ab5613586e3701aa8 Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.284156 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.285160 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.289970 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.294820 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.386367 4885 generic.go:334] "Generic (PLEG): container finished" podID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerID="e839585263d166d270cc23b043c285f2a3b93e6e49d003877c9048e6d915008f" exitCode=0 Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.386444 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerDied","Data":"e839585263d166d270cc23b043c285f2a3b93e6e49d003877c9048e6d915008f"} Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.386509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerStarted","Data":"f21e60121141644d8e9589e406844a3ec296032ed41ac770cc5ae1d5280ca5d8"} Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.395247 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerStarted","Data":"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e"} Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.395568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerStarted","Data":"3e986a6fcb277c5260db378a5592b785e1d26f12b1936a4ab5613586e3701aa8"} Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.419661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt28l\" (UniqueName: \"kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.419732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-catalog-content\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.419765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.467719 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.474939 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.521037 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-catalog-content\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.521117 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.521186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt28l\" (UniqueName: \"kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.521666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-catalog-content\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.523674 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.541347 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt28l\" (UniqueName: \"kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l\") pod \"redhat-operators-n9hmw\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") " pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.607827 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.619126 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.687685 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:08:12 crc kubenswrapper[4885]: E1205 20:08:12.688093 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2c6d12-1e18-498c-82e4-9c778e7c4aea" containerName="collect-profiles" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.688180 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2c6d12-1e18-498c-82e4-9c778e7c4aea" containerName="collect-profiles" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.688365 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2c6d12-1e18-498c-82e4-9c778e7c4aea" containerName="collect-profiles" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.689220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.697062 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.725778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume\") pod \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.726177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume\") pod \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.726234 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjpnn\" (UniqueName: \"kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn\") pod \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\" (UID: \"3e2c6d12-1e18-498c-82e4-9c778e7c4aea\") " Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.727727 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e2c6d12-1e18-498c-82e4-9c778e7c4aea" (UID: "3e2c6d12-1e18-498c-82e4-9c778e7c4aea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.742954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn" (OuterVolumeSpecName: "kube-api-access-rjpnn") pod "3e2c6d12-1e18-498c-82e4-9c778e7c4aea" (UID: "3e2c6d12-1e18-498c-82e4-9c778e7c4aea"). InnerVolumeSpecName "kube-api-access-rjpnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.743726 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e2c6d12-1e18-498c-82e4-9c778e7c4aea" (UID: "3e2c6d12-1e18-498c-82e4-9c778e7c4aea"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.796382 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.797954 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.800938 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.804879 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.813559 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827453 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827527 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4qvh\" (UniqueName: \"kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827821 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827843 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.827855 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjpnn\" (UniqueName: \"kubernetes.io/projected/3e2c6d12-1e18-498c-82e4-9c778e7c4aea-kube-api-access-rjpnn\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.921402 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"] Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.928761 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.928824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.928857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.928885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4qvh\" (UniqueName: \"kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.928906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.929355 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.931664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:12 crc kubenswrapper[4885]: W1205 20:08:12.935672 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a86faf_f4ad_4e5a_b614_4c90d228b05f.slice/crio-4f4480e98b75cb5cbdb27f8642ae40d8686d7f6b8b68ec3338fc34cd6da8161c WatchSource:0}: Error finding container 4f4480e98b75cb5cbdb27f8642ae40d8686d7f6b8b68ec3338fc34cd6da8161c: Status 404 returned error can't find the container with id 4f4480e98b75cb5cbdb27f8642ae40d8686d7f6b8b68ec3338fc34cd6da8161c Dec 05 20:08:12 crc kubenswrapper[4885]: I1205 20:08:12.953503 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4qvh\" (UniqueName: \"kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh\") pod \"redhat-operators-rrdv6\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.030870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.030998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.031233 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.051768 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.056158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.117893 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.407000 4885 generic.go:334] "Generic (PLEG): container finished" podID="42603535-a30f-41b2-96e3-10f3f8144003" containerID="edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e" exitCode=0 Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.407079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerDied","Data":"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e"} Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.413537 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.413536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p" event={"ID":"3e2c6d12-1e18-498c-82e4-9c778e7c4aea","Type":"ContainerDied","Data":"c5a7bc59d4a487d07d1941d9077d09dbd7162fa2a7654271493cd94161b1e4b9"} Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.413579 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a7bc59d4a487d07d1941d9077d09dbd7162fa2a7654271493cd94161b1e4b9" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.416606 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerStarted","Data":"4f4480e98b75cb5cbdb27f8642ae40d8686d7f6b8b68ec3338fc34cd6da8161c"} Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.421121 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-89b4n" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.522837 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.553364 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.789484 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.790881 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.799192 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.799388 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.808425 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.841812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.841854 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.943087 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.943160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.943343 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:13 crc kubenswrapper[4885]: I1205 20:08:13.967584 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.114551 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.428931 4885 generic.go:334] "Generic (PLEG): container finished" podID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerID="9b2fb0095c16a396610e3915982bdd5949190e7f9900efc280b37d152aba56b9" exitCode=0 Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.429040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerDied","Data":"9b2fb0095c16a396610e3915982bdd5949190e7f9900efc280b37d152aba56b9"} Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.429078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerStarted","Data":"de73363c062da1cf5ea99d3a72beab78f1309c22ceddf3e8ade9a8a7e08a0659"} Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.436724 4885 generic.go:334] "Generic (PLEG): container finished" podID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerID="d71c08fc7b39db1040860f1e8918187d7756db4c522fc57f9ed3d4cecca47a5e" exitCode=0 Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.437336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerDied","Data":"d71c08fc7b39db1040860f1e8918187d7756db4c522fc57f9ed3d4cecca47a5e"} Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.450553 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41e47378-e96d-4343-81d4-3b936050774d","Type":"ContainerStarted","Data":"69fb2177f1d5dc79546b3128628a8ca7ff3397e4c6f7b19793aa8d83ab7ae7d1"} Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.450641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41e47378-e96d-4343-81d4-3b936050774d","Type":"ContainerStarted","Data":"4a94cc2fd224a14d5907c55d5c3126f5e2e9050f36279bf0c9b2b9ffa67e1b13"} Dec 05 20:08:14 crc kubenswrapper[4885]: I1205 20:08:14.631194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:08:14 crc kubenswrapper[4885]: W1205 20:08:14.664584 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd8b64c44_e9df_46d2_a1db_40677b4f370b.slice/crio-a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55 WatchSource:0}: Error finding container a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55: Status 404 returned error can't find the container with id a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55 Dec 05 20:08:15 crc kubenswrapper[4885]: I1205 20:08:15.462317 4885 generic.go:334] "Generic (PLEG): container finished" podID="41e47378-e96d-4343-81d4-3b936050774d" containerID="69fb2177f1d5dc79546b3128628a8ca7ff3397e4c6f7b19793aa8d83ab7ae7d1" exitCode=0 Dec 05 20:08:15 crc kubenswrapper[4885]: I1205 20:08:15.462404 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41e47378-e96d-4343-81d4-3b936050774d","Type":"ContainerDied","Data":"69fb2177f1d5dc79546b3128628a8ca7ff3397e4c6f7b19793aa8d83ab7ae7d1"} Dec 05 20:08:15 crc kubenswrapper[4885]: I1205 20:08:15.475943 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d8b64c44-e9df-46d2-a1db-40677b4f370b","Type":"ContainerStarted","Data":"a508267bc3dce608aa1562627cd1bb8f7154a31c1862e00bb0b98bf7e2c7666d"} Dec 05 20:08:15 crc kubenswrapper[4885]: I1205 20:08:15.476003 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d8b64c44-e9df-46d2-a1db-40677b4f370b","Type":"ContainerStarted","Data":"a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55"} Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.484169 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8b64c44-e9df-46d2-a1db-40677b4f370b" containerID="a508267bc3dce608aa1562627cd1bb8f7154a31c1862e00bb0b98bf7e2c7666d" exitCode=0 Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.484275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d8b64c44-e9df-46d2-a1db-40677b4f370b","Type":"ContainerDied","Data":"a508267bc3dce608aa1562627cd1bb8f7154a31c1862e00bb0b98bf7e2c7666d"} Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.630791 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.631088 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.771418 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.914260 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir\") pod \"41e47378-e96d-4343-81d4-3b936050774d\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.914313 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access\") pod \"41e47378-e96d-4343-81d4-3b936050774d\" (UID: \"41e47378-e96d-4343-81d4-3b936050774d\") " Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.914379 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41e47378-e96d-4343-81d4-3b936050774d" (UID: "41e47378-e96d-4343-81d4-3b936050774d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.914512 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e47378-e96d-4343-81d4-3b936050774d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:16 crc kubenswrapper[4885]: I1205 20:08:16.920386 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41e47378-e96d-4343-81d4-3b936050774d" (UID: "41e47378-e96d-4343-81d4-3b936050774d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.015584 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41e47378-e96d-4343-81d4-3b936050774d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.487329 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mfhrt" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.501117 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.501414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41e47378-e96d-4343-81d4-3b936050774d","Type":"ContainerDied","Data":"4a94cc2fd224a14d5907c55d5c3126f5e2e9050f36279bf0c9b2b9ffa67e1b13"} Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.502350 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a94cc2fd224a14d5907c55d5c3126f5e2e9050f36279bf0c9b2b9ffa67e1b13" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.693211 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.824525 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access\") pod \"d8b64c44-e9df-46d2-a1db-40677b4f370b\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.824912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir\") pod \"d8b64c44-e9df-46d2-a1db-40677b4f370b\" (UID: \"d8b64c44-e9df-46d2-a1db-40677b4f370b\") " Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.825193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d8b64c44-e9df-46d2-a1db-40677b4f370b" (UID: "d8b64c44-e9df-46d2-a1db-40677b4f370b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.825257 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b64c44-e9df-46d2-a1db-40677b4f370b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.829097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d8b64c44-e9df-46d2-a1db-40677b4f370b" (UID: "d8b64c44-e9df-46d2-a1db-40677b4f370b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:08:17 crc kubenswrapper[4885]: I1205 20:08:17.925917 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b64c44-e9df-46d2-a1db-40677b4f370b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:18 crc kubenswrapper[4885]: I1205 20:08:18.509820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d8b64c44-e9df-46d2-a1db-40677b4f370b","Type":"ContainerDied","Data":"a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55"} Dec 05 20:08:18 crc kubenswrapper[4885]: I1205 20:08:18.509858 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a249ba2ce76f55b823fe5f18f723ea13f9fd676d25cd4bd32b250e808c9c1c55" Dec 05 20:08:18 crc kubenswrapper[4885]: I1205 20:08:18.509895 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:08:19 crc kubenswrapper[4885]: I1205 20:08:19.751635 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:08:19 crc kubenswrapper[4885]: I1205 20:08:19.755627 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c0a952-e24a-49c2-b4ba-e20be61b840d-metrics-certs\") pod \"network-metrics-daemon-2jdj4\" (UID: \"a5c0a952-e24a-49c2-b4ba-e20be61b840d\") " pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:08:20 crc kubenswrapper[4885]: I1205 20:08:20.004182 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2jdj4" Dec 05 20:08:21 crc kubenswrapper[4885]: I1205 20:08:21.290390 4885 patch_prober.go:28] interesting pod/console-f9d7485db-jdrlk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 05 20:08:21 crc kubenswrapper[4885]: I1205 20:08:21.290450 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jdrlk" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 05 20:08:29 crc kubenswrapper[4885]: I1205 20:08:29.755685 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:08:31 crc kubenswrapper[4885]: I1205 20:08:31.462899 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:08:31 crc kubenswrapper[4885]: I1205 20:08:31.465410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:31 crc kubenswrapper[4885]: I1205 20:08:31.471545 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.062772 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.063459 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5qlr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-28gdb_openshift-marketplace(61f67e39-acf3-4ec4-af3f-68159973345e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.064708 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-28gdb" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.488365 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.489151 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6724d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pkhq9_openshift-marketplace(42603535-a30f-41b2-96e3-10f3f8144003): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.490439 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pkhq9" podUID="42603535-a30f-41b2-96e3-10f3f8144003" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.969599 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.969817 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4nl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bl5hc_openshift-marketplace(3bca36c4-e503-4b3f-aaeb-829cebc24e4c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:40 crc kubenswrapper[4885]: E1205 20:08:40.971054 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bl5hc" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" Dec 05 20:08:42 crc kubenswrapper[4885]: I1205 20:08:42.438206 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwpfl" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.498187 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-28gdb" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.498287 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pkhq9" podUID="42603535-a30f-41b2-96e3-10f3f8144003" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.498448 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bl5hc" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.698373 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.698578 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tq47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-24244_openshift-marketplace(746679e1-b958-4320-bc6c-00060a83db3f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.699810 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-24244" podUID="746679e1-b958-4320-bc6c-00060a83db3f" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.754360 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.754516 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h987n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-76bdk_openshift-marketplace(ba098ab6-d9df-4d50-aaa6-085658e80871): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:42 crc kubenswrapper[4885]: E1205 20:08:42.755911 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-76bdk" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.769288 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:08:44 crc kubenswrapper[4885]: E1205 20:08:44.769807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b64c44-e9df-46d2-a1db-40677b4f370b" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.769821 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b64c44-e9df-46d2-a1db-40677b4f370b" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: E1205 20:08:44.769841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e47378-e96d-4343-81d4-3b936050774d" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.769849 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e47378-e96d-4343-81d4-3b936050774d" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.769963 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e47378-e96d-4343-81d4-3b936050774d" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.769981 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b64c44-e9df-46d2-a1db-40677b4f370b" containerName="pruner" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.770426 4885 util.go:30] "No sandbox for pod can be found. 
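Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"

The ErrImagePull events above turn into the ImagePullBackOff messages that follow: once a pull fails, the pod worker does not retry immediately but waits an exponentially growing interval, by default roughly 10s doubling up to a 5m cap. A sketch of that retry shape, with pullImage as a hypothetical stand-in for the real CRI call and illustrative constants rather than values read from the kubelet config:

    // backoff.go: the retry pattern behind "Back-off pulling image ...".
    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // pullImage is a placeholder for the CRI PullImage RPC failing in the log.
    func pullImage(image string) error {
        return errors.New("rpc error: code = Canceled desc = copying system image from manifest list")
    }

    func main() {
        const image = "registry.redhat.io/redhat/certified-operator-index:v4.18"
        delay, max := 10*time.Second, 5*time.Minute
        for attempt := 1; attempt <= 5; attempt++ {
            if err := pullImage(image); err == nil {
                fmt.Println("pulled", image)
                return
            }
            fmt.Printf("attempt %d failed; next retry in %s (Back-off pulling image %q)\n",
                attempt, delay, image)
            time.Sleep(delay) // the kubelet tracks this per container instead of sleeping
            if delay *= 2; delay > max {
                delay = max
            }
        }
    }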
Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.773331 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.773560 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.780832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.875874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.875960 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.977807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.977862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:44 crc kubenswrapper[4885]: I1205 20:08:44.977966 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:45 crc kubenswrapper[4885]: I1205 20:08:45.009952 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:45 crc kubenswrapper[4885]: I1205 20:08:45.098666 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:45 crc kubenswrapper[4885]: E1205 20:08:45.676196 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-24244" podUID="746679e1-b958-4320-bc6c-00060a83db3f" Dec 05 20:08:45 crc kubenswrapper[4885]: E1205 20:08:45.676636 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-76bdk" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" Dec 05 20:08:45 crc kubenswrapper[4885]: E1205 20:08:45.704777 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 20:08:45 crc kubenswrapper[4885]: E1205 20:08:45.704935 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4qvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rrdv6_openshift-marketplace(f26a71ec-b73f-472e-9f1a-2baae67c0691): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:08:45 crc kubenswrapper[4885]: E1205 20:08:45.706127 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rrdv6" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" Dec 05 20:08:45 crc kubenswrapper[4885]: I1205 20:08:45.984334 4885 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2jdj4"] Dec 05 20:08:45 crc kubenswrapper[4885]: W1205 20:08:45.988117 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c0a952_e24a_49c2_b4ba_e20be61b840d.slice/crio-6f0a5d35357d966e233763a3259c70171e4c7c31f9debe24273033cbd72d4877 WatchSource:0}: Error finding container 6f0a5d35357d966e233763a3259c70171e4c7c31f9debe24273033cbd72d4877: Status 404 returned error can't find the container with id 6f0a5d35357d966e233763a3259c70171e4c7c31f9debe24273033cbd72d4877 Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.005701 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:08:46 crc kubenswrapper[4885]: W1205 20:08:46.019225 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda029dc62_7104_4d74_8201_cb8b16c79e02.slice/crio-8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921 WatchSource:0}: Error finding container 8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921: Status 404 returned error can't find the container with id 8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921 Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.631215 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.631542 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.668487 4885 generic.go:334] "Generic (PLEG): container finished" podID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerID="c331e665a7a095e20adb4e3021499753e7fc7fb06f5a89a70e4fee9c622dcde0" exitCode=0 Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.668585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerDied","Data":"c331e665a7a095e20adb4e3021499753e7fc7fb06f5a89a70e4fee9c622dcde0"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.671032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" event={"ID":"a5c0a952-e24a-49c2-b4ba-e20be61b840d","Type":"ContainerStarted","Data":"666bf8b4b19d1de2dd8be47199bf8fe653ddeab49c6421d4faa74c6bc8213135"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.671066 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" event={"ID":"a5c0a952-e24a-49c2-b4ba-e20be61b840d","Type":"ContainerStarted","Data":"617aae20fa1eab8744a99ad7d336b7b282e822c61f134072e8428036e8c5c7bf"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.671077 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2jdj4" 
event={"ID":"a5c0a952-e24a-49c2-b4ba-e20be61b840d","Type":"ContainerStarted","Data":"6f0a5d35357d966e233763a3259c70171e4c7c31f9debe24273033cbd72d4877"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.673348 4885 generic.go:334] "Generic (PLEG): container finished" podID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerID="200b550d4c45d9e5a5409b99ac1a7953b120128e3baa8c05f6cf354052fabe35" exitCode=0 Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.673395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerDied","Data":"200b550d4c45d9e5a5409b99ac1a7953b120128e3baa8c05f6cf354052fabe35"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.678977 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a029dc62-7104-4d74-8201-cb8b16c79e02","Type":"ContainerStarted","Data":"917297c31356e8dd14d301b337ff91a330e411753dd71431837a3cebce931a01"} Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.679064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a029dc62-7104-4d74-8201-cb8b16c79e02","Type":"ContainerStarted","Data":"8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921"} Dec 05 20:08:46 crc kubenswrapper[4885]: E1205 20:08:46.681806 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rrdv6" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.702558 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2jdj4" podStartSLOduration=169.702538962 podStartE2EDuration="2m49.702538962s" podCreationTimestamp="2025-12-05 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:46.698962634 +0000 UTC m=+191.995778295" watchObservedRunningTime="2025-12-05 20:08:46.702538962 +0000 UTC m=+191.999354623" Dec 05 20:08:46 crc kubenswrapper[4885]: I1205 20:08:46.753138 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.753118825 podStartE2EDuration="2.753118825s" podCreationTimestamp="2025-12-05 20:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:08:46.747204849 +0000 UTC m=+192.044020510" watchObservedRunningTime="2025-12-05 20:08:46.753118825 +0000 UTC m=+192.049934486" Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.687710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerStarted","Data":"43278e7cf72b3bded009173d27e3c91220d6255b10cd3f3a86db991ad544fe57"} Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.691826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" 
event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerStarted","Data":"df925bd36fa538dc6f19a012f9f9aa71e667ca946f5b1a0343d9d29ed41bad59"} Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.693473 4885 generic.go:334] "Generic (PLEG): container finished" podID="a029dc62-7104-4d74-8201-cb8b16c79e02" containerID="917297c31356e8dd14d301b337ff91a330e411753dd71431837a3cebce931a01" exitCode=0 Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.694120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a029dc62-7104-4d74-8201-cb8b16c79e02","Type":"ContainerDied","Data":"917297c31356e8dd14d301b337ff91a330e411753dd71431837a3cebce931a01"} Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.707250 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g2dmb" podStartSLOduration=1.995301489 podStartE2EDuration="36.707232677s" podCreationTimestamp="2025-12-05 20:08:11 +0000 UTC" firstStartedPulling="2025-12-05 20:08:12.392075765 +0000 UTC m=+157.688891416" lastFinishedPulling="2025-12-05 20:08:47.104006943 +0000 UTC m=+192.400822604" observedRunningTime="2025-12-05 20:08:47.704453595 +0000 UTC m=+193.001269256" watchObservedRunningTime="2025-12-05 20:08:47.707232677 +0000 UTC m=+193.004048338" Dec 05 20:08:47 crc kubenswrapper[4885]: I1205 20:08:47.723936 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9hmw" podStartSLOduration=3.105263036 podStartE2EDuration="35.723918133s" podCreationTimestamp="2025-12-05 20:08:12 +0000 UTC" firstStartedPulling="2025-12-05 20:08:14.443685989 +0000 UTC m=+159.740501650" lastFinishedPulling="2025-12-05 20:08:47.062341086 +0000 UTC m=+192.359156747" observedRunningTime="2025-12-05 20:08:47.721681597 +0000 UTC m=+193.018497268" watchObservedRunningTime="2025-12-05 20:08:47.723918133 +0000 UTC m=+193.020733794" Dec 05 20:08:48 crc kubenswrapper[4885]: I1205 20:08:48.956950 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.032121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir\") pod \"a029dc62-7104-4d74-8201-cb8b16c79e02\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.032254 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access\") pod \"a029dc62-7104-4d74-8201-cb8b16c79e02\" (UID: \"a029dc62-7104-4d74-8201-cb8b16c79e02\") " Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.032259 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a029dc62-7104-4d74-8201-cb8b16c79e02" (UID: "a029dc62-7104-4d74-8201-cb8b16c79e02"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.032544 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a029dc62-7104-4d74-8201-cb8b16c79e02-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.044774 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a029dc62-7104-4d74-8201-cb8b16c79e02" (UID: "a029dc62-7104-4d74-8201-cb8b16c79e02"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.133578 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a029dc62-7104-4d74-8201-cb8b16c79e02-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.707284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a029dc62-7104-4d74-8201-cb8b16c79e02","Type":"ContainerDied","Data":"8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921"} Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.707665 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eac40f8f728d1e7f0f48d90952dc2253954e9417d7a0c24ccab17cade2d8921" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.707743 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.778089 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:08:49 crc kubenswrapper[4885]: E1205 20:08:49.778482 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a029dc62-7104-4d74-8201-cb8b16c79e02" containerName="pruner" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.778508 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a029dc62-7104-4d74-8201-cb8b16c79e02" containerName="pruner" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.778664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a029dc62-7104-4d74-8201-cb8b16c79e02" containerName="pruner" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.779273 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.781103 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.781113 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.784743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.844043 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.844145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.844194 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.945714 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.945791 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.945854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.945889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.945948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock\") pod \"installer-9-crc\" (UID: 
\"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:49 crc kubenswrapper[4885]: I1205 20:08:49.969545 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access\") pod \"installer-9-crc\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:50 crc kubenswrapper[4885]: I1205 20:08:50.103175 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:08:50 crc kubenswrapper[4885]: I1205 20:08:50.549340 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:08:50 crc kubenswrapper[4885]: W1205 20:08:50.559403 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaa54f954_f05a_44b2_8f26_4a9990d44845.slice/crio-816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387 WatchSource:0}: Error finding container 816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387: Status 404 returned error can't find the container with id 816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387 Dec 05 20:08:50 crc kubenswrapper[4885]: I1205 20:08:50.711980 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"aa54f954-f05a-44b2-8f26-4a9990d44845","Type":"ContainerStarted","Data":"816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387"} Dec 05 20:08:51 crc kubenswrapper[4885]: I1205 20:08:51.428380 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:51 crc kubenswrapper[4885]: I1205 20:08:51.428856 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:51 crc kubenswrapper[4885]: I1205 20:08:51.515919 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:08:51 crc kubenswrapper[4885]: I1205 20:08:51.717884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"aa54f954-f05a-44b2-8f26-4a9990d44845","Type":"ContainerStarted","Data":"f1c299c0d56a529caf2c9ce34b6fff3d0defa222c28bca3afea01cb17fbd6dfa"} Dec 05 20:08:52 crc kubenswrapper[4885]: I1205 20:08:52.608303 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:52 crc kubenswrapper[4885]: I1205 20:08:52.608387 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:08:53 crc kubenswrapper[4885]: I1205 20:08:53.654909 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n9hmw" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="registry-server" probeResult="failure" output=< Dec 05 20:08:53 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 20:08:53 crc kubenswrapper[4885]: > Dec 05 20:08:54 crc kubenswrapper[4885]: I1205 20:08:54.195894 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.195871783 
Dec 05 20:08:56 crc kubenswrapper[4885]: I1205 20:08:56.743704 4885 generic.go:334] "Generic (PLEG): container finished" podID="42603535-a30f-41b2-96e3-10f3f8144003" containerID="4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5" exitCode=0 Dec 05 20:08:56 crc kubenswrapper[4885]: I1205 20:08:56.743784 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerDied","Data":"4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5"} Dec 05 20:08:57 crc kubenswrapper[4885]: I1205 20:08:57.750382 4885 generic.go:334] "Generic (PLEG): container finished" podID="61f67e39-acf3-4ec4-af3f-68159973345e" containerID="d236f500f04d3ebfcaf7b810d1d2049c27a683255308bc63be3dbf348f17e89f" exitCode=0 Dec 05 20:08:57 crc kubenswrapper[4885]: I1205 20:08:57.750468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerDied","Data":"d236f500f04d3ebfcaf7b810d1d2049c27a683255308bc63be3dbf348f17e89f"} Dec 05 20:09:00 crc kubenswrapper[4885]: I1205 20:09:00.256358 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"] Dec 05 20:09:00 crc kubenswrapper[4885]: I1205 20:09:00.773013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerStarted","Data":"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d"} Dec 05 20:09:00 crc kubenswrapper[4885]: I1205 20:09:00.778166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerStarted","Data":"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce"} Dec 05 20:09:00 crc kubenswrapper[4885]: I1205 20:09:00.814157 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pkhq9" podStartSLOduration=1.892471918 podStartE2EDuration="49.814133465s" podCreationTimestamp="2025-12-05 20:08:11 +0000 UTC" firstStartedPulling="2025-12-05 20:08:12.401662224 +0000 UTC m=+157.698477885" lastFinishedPulling="2025-12-05 20:09:00.323323771 +0000 UTC m=+205.620139432" observedRunningTime="2025-12-05 20:09:00.814106614 +0000 UTC m=+206.110922275" watchObservedRunningTime="2025-12-05 20:09:00.814133465 +0000 UTC m=+206.110949146" Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.491490 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.784311 4885 generic.go:334] "Generic (PLEG): container finished" podID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerID="0bfa6d778d305b19ebf46ab810a977eebe77c7687017fa9c2530664ef8554bd3" exitCode=0 Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.784393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerDied","Data":"0bfa6d778d305b19ebf46ab810a977eebe77c7687017fa9c2530664ef8554bd3"} Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.788435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerStarted","Data":"bfc8f52385ce855919cfb4fb6c9b1f526b1f4c727797bec0c95fb7548b8a87ec"} Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.791729 4885 generic.go:334] "Generic (PLEG): container finished" podID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerID="4212a66d5d5f9213143c14cf7747f767d47ca182bda948cf7becb153b707f67f" exitCode=0 Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.791815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerDied","Data":"4212a66d5d5f9213143c14cf7747f767d47ca182bda948cf7becb153b707f67f"} Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.796330 4885 generic.go:334] "Generic (PLEG): container finished" podID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerID="cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d" exitCode=0 Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.796361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerDied","Data":"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d"} Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.840963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-28gdb" podStartSLOduration=2.815286038 podStartE2EDuration="52.840947197s" podCreationTimestamp="2025-12-05 20:08:09 +0000 UTC" firstStartedPulling="2025-12-05 20:08:10.33711532 +0000 UTC m=+155.633930981" lastFinishedPulling="2025-12-05 20:09:00.362776479 +0000 UTC m=+205.659592140" observedRunningTime="2025-12-05 20:09:01.838782126 +0000 UTC m=+207.135597797" watchObservedRunningTime="2025-12-05 20:09:01.840947197 +0000 UTC m=+207.137762858" Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.875873 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:01 crc kubenswrapper[4885]: I1205 20:09:01.875923 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:02 crc kubenswrapper[4885]: I1205 20:09:02.652404 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:09:02 crc kubenswrapper[4885]: I1205 20:09:02.710067 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:09:02 crc kubenswrapper[4885]: I1205 20:09:02.918895 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pkhq9" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="registry-server" probeResult="failure" output=< Dec 05 20:09:02 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 20:09:02 crc kubenswrapper[4885]: > Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.814493 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerStarted","Data":"53efc13e8146729bc0e9cadb50f1a23e7f390122ec8d5d85eecdaa41800387b8"} Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.816103 4885 generic.go:334] "Generic (PLEG): container finished" podID="746679e1-b958-4320-bc6c-00060a83db3f" containerID="0c22501379994176a2e773e608d6442d78718e254ec8d0ce1fb34e2aec652438" exitCode=0 Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.816128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerDied","Data":"0c22501379994176a2e773e608d6442d78718e254ec8d0ce1fb34e2aec652438"} Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.818497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerStarted","Data":"9647a610d3a466e3140679f4c5ff0805df326a39296d3879bd2fa07fcee3188b"} Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.821674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerStarted","Data":"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f"} Dec 05 20:09:03 crc kubenswrapper[4885]: I1205 20:09:03.986815 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-76bdk" podStartSLOduration=2.38422665 podStartE2EDuration="54.986796857s" podCreationTimestamp="2025-12-05 20:08:09 +0000 UTC" firstStartedPulling="2025-12-05 20:08:10.334986949 +0000 UTC m=+155.631802610" lastFinishedPulling="2025-12-05 20:09:02.937557156 +0000 UTC m=+208.234372817" observedRunningTime="2025-12-05 20:09:03.984813701 +0000 UTC m=+209.281629382" watchObservedRunningTime="2025-12-05 20:09:03.986796857 +0000 UTC m=+209.283612518" Dec 05 20:09:04 crc kubenswrapper[4885]: I1205 20:09:04.017910 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bl5hc" podStartSLOduration=3.550100381 podStartE2EDuration="55.017890339s" podCreationTimestamp="2025-12-05 20:08:09 +0000 UTC" firstStartedPulling="2025-12-05 20:08:11.355852681 +0000 UTC m=+156.652668342" lastFinishedPulling="2025-12-05 20:09:02.823642639 +0000 UTC m=+208.120458300" observedRunningTime="2025-12-05 20:09:04.015843022 +0000 UTC m=+209.312658693" watchObservedRunningTime="2025-12-05 20:09:04.017890339 +0000 UTC m=+209.314706000" Dec 05 20:09:04 crc kubenswrapper[4885]: I1205 20:09:04.829004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerStarted","Data":"61990de48711f3bac0a406b6c1f3c6f0caeeb6c924675330235b61763540d3f7"} Dec 05 20:09:04 crc kubenswrapper[4885]: I1205 20:09:04.854691 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rrdv6" podStartSLOduration=4.382200054 podStartE2EDuration="52.854675172s" podCreationTimestamp="2025-12-05 20:08:12 +0000 UTC" firstStartedPulling="2025-12-05 20:08:14.431725941 +0000 UTC m=+159.728541602" lastFinishedPulling="2025-12-05 20:09:02.904201059 +0000 UTC m=+208.201016720" 
observedRunningTime="2025-12-05 20:09:04.056597523 +0000 UTC m=+209.353413184" watchObservedRunningTime="2025-12-05 20:09:04.854675172 +0000 UTC m=+210.151490833" Dec 05 20:09:04 crc kubenswrapper[4885]: I1205 20:09:04.855207 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-24244" podStartSLOduration=2.019445254 podStartE2EDuration="55.855203589s" podCreationTimestamp="2025-12-05 20:08:09 +0000 UTC" firstStartedPulling="2025-12-05 20:08:10.3422411 +0000 UTC m=+155.639056761" lastFinishedPulling="2025-12-05 20:09:04.177999435 +0000 UTC m=+209.474815096" observedRunningTime="2025-12-05 20:09:04.85096992 +0000 UTC m=+210.147785581" watchObservedRunningTime="2025-12-05 20:09:04.855203589 +0000 UTC m=+210.152019250" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.404351 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-24244" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.404902 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-24244" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.455856 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-24244" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.609683 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.609727 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.646484 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.842751 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.843066 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.890515 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.900629 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.909768 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-24244" Dec 05 20:09:09 crc kubenswrapper[4885]: I1205 20:09:09.935703 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:10 crc kubenswrapper[4885]: I1205 20:09:10.044887 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:10 crc kubenswrapper[4885]: I1205 20:09:10.044950 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:10 crc kubenswrapper[4885]: I1205 20:09:10.093448 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:10 crc kubenswrapper[4885]: I1205 20:09:10.919591 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:11 crc kubenswrapper[4885]: I1205 20:09:11.922078 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:11 crc kubenswrapper[4885]: I1205 20:09:11.973624 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:09:11 crc kubenswrapper[4885]: I1205 20:09:11.983352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:12 crc kubenswrapper[4885]: I1205 20:09:12.177825 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:09:12 crc kubenswrapper[4885]: I1205 20:09:12.178340 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-76bdk" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="registry-server" containerID="cri-o://53efc13e8146729bc0e9cadb50f1a23e7f390122ec8d5d85eecdaa41800387b8" gracePeriod=2 Dec 05 20:09:12 crc kubenswrapper[4885]: I1205 20:09:12.875888 4885 generic.go:334] "Generic (PLEG): container finished" podID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerID="53efc13e8146729bc0e9cadb50f1a23e7f390122ec8d5d85eecdaa41800387b8" exitCode=0 Dec 05 20:09:12 crc kubenswrapper[4885]: I1205 20:09:12.875979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerDied","Data":"53efc13e8146729bc0e9cadb50f1a23e7f390122ec8d5d85eecdaa41800387b8"} Dec 05 20:09:12 crc kubenswrapper[4885]: I1205 20:09:12.876554 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bl5hc" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="registry-server" containerID="cri-o://a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f" gracePeriod=2 Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.052333 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.052404 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.067442 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.138750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.152298 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h987n\" (UniqueName: \"kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n\") pod \"ba098ab6-d9df-4d50-aaa6-085658e80871\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.152438 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content\") pod \"ba098ab6-d9df-4d50-aaa6-085658e80871\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.152481 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities\") pod \"ba098ab6-d9df-4d50-aaa6-085658e80871\" (UID: \"ba098ab6-d9df-4d50-aaa6-085658e80871\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.157206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities" (OuterVolumeSpecName: "utilities") pod "ba098ab6-d9df-4d50-aaa6-085658e80871" (UID: "ba098ab6-d9df-4d50-aaa6-085658e80871"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.161733 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n" (OuterVolumeSpecName: "kube-api-access-h987n") pod "ba098ab6-d9df-4d50-aaa6-085658e80871" (UID: "ba098ab6-d9df-4d50-aaa6-085658e80871"). InnerVolumeSpecName "kube-api-access-h987n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.209187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba098ab6-d9df-4d50-aaa6-085658e80871" (UID: "ba098ab6-d9df-4d50-aaa6-085658e80871"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.212561 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.256042 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h987n\" (UniqueName: \"kubernetes.io/projected/ba098ab6-d9df-4d50-aaa6-085658e80871-kube-api-access-h987n\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.256104 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.256123 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba098ab6-d9df-4d50-aaa6-085658e80871-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.357219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities\") pod \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.357351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content\") pod \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.357428 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4nl4\" (UniqueName: \"kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4\") pod \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\" (UID: \"3bca36c4-e503-4b3f-aaeb-829cebc24e4c\") " Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.358762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities" (OuterVolumeSpecName: "utilities") pod "3bca36c4-e503-4b3f-aaeb-829cebc24e4c" (UID: "3bca36c4-e503-4b3f-aaeb-829cebc24e4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.362116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4" (OuterVolumeSpecName: "kube-api-access-s4nl4") pod "3bca36c4-e503-4b3f-aaeb-829cebc24e4c" (UID: "3bca36c4-e503-4b3f-aaeb-829cebc24e4c"). InnerVolumeSpecName "kube-api-access-s4nl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.418785 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bca36c4-e503-4b3f-aaeb-829cebc24e4c" (UID: "3bca36c4-e503-4b3f-aaeb-829cebc24e4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.458918 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.458967 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.458987 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4nl4\" (UniqueName: \"kubernetes.io/projected/3bca36c4-e503-4b3f-aaeb-829cebc24e4c-kube-api-access-s4nl4\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.888883 4885 generic.go:334] "Generic (PLEG): container finished" podID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerID="a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f" exitCode=0 Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.889011 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bl5hc" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.889016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerDied","Data":"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f"} Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.889127 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl5hc" event={"ID":"3bca36c4-e503-4b3f-aaeb-829cebc24e4c","Type":"ContainerDied","Data":"60b51facfd94710ee72d5fa43ac7d310fc0be42eddd26c01b0449b49ac982e8f"} Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.889160 4885 scope.go:117] "RemoveContainer" containerID="a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.893839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76bdk" event={"ID":"ba098ab6-d9df-4d50-aaa6-085658e80871","Type":"ContainerDied","Data":"e32a3e71bd1c4103f742a621b7874f7a71266de42f95797ed45dccbc197c6c21"} Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.895340 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76bdk" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.913645 4885 scope.go:117] "RemoveContainer" containerID="cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.933104 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.937985 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bl5hc"] Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.951061 4885 scope.go:117] "RemoveContainer" containerID="5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.977566 4885 scope.go:117] "RemoveContainer" containerID="a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f" Dec 05 20:09:13 crc kubenswrapper[4885]: E1205 20:09:13.978574 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f\": container with ID starting with a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f not found: ID does not exist" containerID="a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.978648 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f"} err="failed to get container status \"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f\": rpc error: code = NotFound desc = could not find container \"a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f\": container with ID starting with a2941432578dce5760af719824b9b7c72902bfcb3559886ed9b809a69be2206f not found: ID does not exist" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.978723 4885 scope.go:117] "RemoveContainer" containerID="cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.979543 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:09:13 crc kubenswrapper[4885]: E1205 20:09:13.981223 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d\": container with ID starting with cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d not found: ID does not exist" containerID="cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.981261 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d"} err="failed to get container status \"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d\": rpc error: code = NotFound desc = could not find container \"cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d\": container with ID starting with cb8986e98fd8741710d138740215ab0ac36b482068b16baede1f08e030cb581d not found: ID does not exist" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.981290 4885 scope.go:117] "RemoveContainer" 
containerID="5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a" Dec 05 20:09:13 crc kubenswrapper[4885]: E1205 20:09:13.981678 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a\": container with ID starting with 5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a not found: ID does not exist" containerID="5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.981756 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a"} err="failed to get container status \"5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a\": rpc error: code = NotFound desc = could not find container \"5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a\": container with ID starting with 5af2c067aac8a771afac56b405d377d9832177a0728b1d5de84a72637ab0e60a not found: ID does not exist" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.981815 4885 scope.go:117] "RemoveContainer" containerID="53efc13e8146729bc0e9cadb50f1a23e7f390122ec8d5d85eecdaa41800387b8" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.982363 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:13 crc kubenswrapper[4885]: I1205 20:09:13.986140 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-76bdk"] Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.003974 4885 scope.go:117] "RemoveContainer" containerID="0bfa6d778d305b19ebf46ab810a977eebe77c7687017fa9c2530664ef8554bd3" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.024461 4885 scope.go:117] "RemoveContainer" containerID="7839561cfb0c4f78ca86ff632075444f2afebc20c60f20aeb845dd09bb97f091" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.370989 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.371236 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pkhq9" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="registry-server" containerID="cri-o://2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce" gracePeriod=2 Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.723228 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.776865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6724d\" (UniqueName: \"kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d\") pod \"42603535-a30f-41b2-96e3-10f3f8144003\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.777074 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content\") pod \"42603535-a30f-41b2-96e3-10f3f8144003\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.777143 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities\") pod \"42603535-a30f-41b2-96e3-10f3f8144003\" (UID: \"42603535-a30f-41b2-96e3-10f3f8144003\") " Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.777982 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities" (OuterVolumeSpecName: "utilities") pod "42603535-a30f-41b2-96e3-10f3f8144003" (UID: "42603535-a30f-41b2-96e3-10f3f8144003"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.780624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d" (OuterVolumeSpecName: "kube-api-access-6724d") pod "42603535-a30f-41b2-96e3-10f3f8144003" (UID: "42603535-a30f-41b2-96e3-10f3f8144003"). InnerVolumeSpecName "kube-api-access-6724d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.794977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42603535-a30f-41b2-96e3-10f3f8144003" (UID: "42603535-a30f-41b2-96e3-10f3f8144003"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.878615 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.878650 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42603535-a30f-41b2-96e3-10f3f8144003-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.878660 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6724d\" (UniqueName: \"kubernetes.io/projected/42603535-a30f-41b2-96e3-10f3f8144003-kube-api-access-6724d\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.902913 4885 generic.go:334] "Generic (PLEG): container finished" podID="42603535-a30f-41b2-96e3-10f3f8144003" containerID="2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce" exitCode=0 Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.902965 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerDied","Data":"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce"} Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.903078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkhq9" event={"ID":"42603535-a30f-41b2-96e3-10f3f8144003","Type":"ContainerDied","Data":"3e986a6fcb277c5260db378a5592b785e1d26f12b1936a4ab5613586e3701aa8"} Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.903113 4885 scope.go:117] "RemoveContainer" containerID="2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.902989 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkhq9" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.918259 4885 scope.go:117] "RemoveContainer" containerID="4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.927110 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.929791 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkhq9"] Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.938954 4885 scope.go:117] "RemoveContainer" containerID="edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.958898 4885 scope.go:117] "RemoveContainer" containerID="2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce" Dec 05 20:09:14 crc kubenswrapper[4885]: E1205 20:09:14.959501 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce\": container with ID starting with 2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce not found: ID does not exist" containerID="2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.959530 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce"} err="failed to get container status \"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce\": rpc error: code = NotFound desc = could not find container \"2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce\": container with ID starting with 2a461154d342cfe1d497d541ad29e1e3cd442304d705d097c17d49a90ffc14ce not found: ID does not exist" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.959559 4885 scope.go:117] "RemoveContainer" containerID="4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5" Dec 05 20:09:14 crc kubenswrapper[4885]: E1205 20:09:14.960106 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5\": container with ID starting with 4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5 not found: ID does not exist" containerID="4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.960188 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5"} err="failed to get container status \"4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5\": rpc error: code = NotFound desc = could not find container \"4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5\": container with ID starting with 4f9fe0a38abee310af1e4da53ffb9d242ca6ddd876e0dcabb2aa07b9944334f5 not found: ID does not exist" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.960234 4885 scope.go:117] "RemoveContainer" containerID="edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e" Dec 05 20:09:14 crc kubenswrapper[4885]: E1205 20:09:14.960672 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e\": container with ID starting with edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e not found: ID does not exist" containerID="edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e" Dec 05 20:09:14 crc kubenswrapper[4885]: I1205 20:09:14.960717 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e"} err="failed to get container status \"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e\": rpc error: code = NotFound desc = could not find container \"edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e\": container with ID starting with edbd2b3f4a34ccb59ced1721b16410330d18855193863b8448984fd184c9494e not found: ID does not exist" Dec 05 20:09:15 crc kubenswrapper[4885]: I1205 20:09:15.185316 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" path="/var/lib/kubelet/pods/3bca36c4-e503-4b3f-aaeb-829cebc24e4c/volumes" Dec 05 20:09:15 crc kubenswrapper[4885]: I1205 20:09:15.186090 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42603535-a30f-41b2-96e3-10f3f8144003" path="/var/lib/kubelet/pods/42603535-a30f-41b2-96e3-10f3f8144003/volumes" Dec 05 20:09:15 crc kubenswrapper[4885]: I1205 20:09:15.186849 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" path="/var/lib/kubelet/pods/ba098ab6-d9df-4d50-aaa6-085658e80871/volumes" Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.631359 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.631901 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.631984 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.632982 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.633136 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda" gracePeriod=600 Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.770196 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:09:16 crc kubenswrapper[4885]: I1205 20:09:16.770456 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rrdv6" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="registry-server" containerID="cri-o://9647a610d3a466e3140679f4c5ff0805df326a39296d3879bd2fa07fcee3188b" gracePeriod=2 Dec 05 20:09:17 crc kubenswrapper[4885]: I1205 20:09:17.922599 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda" exitCode=0 Dec 05 20:09:17 crc kubenswrapper[4885]: I1205 20:09:17.922714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda"} Dec 05 20:09:17 crc kubenswrapper[4885]: I1205 20:09:17.925861 4885 generic.go:334] "Generic (PLEG): container finished" podID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerID="9647a610d3a466e3140679f4c5ff0805df326a39296d3879bd2fa07fcee3188b" exitCode=0 Dec 05 20:09:17 crc kubenswrapper[4885]: I1205 20:09:17.925899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerDied","Data":"9647a610d3a466e3140679f4c5ff0805df326a39296d3879bd2fa07fcee3188b"} Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.240311 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.324414 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content\") pod \"f26a71ec-b73f-472e-9f1a-2baae67c0691\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.324511 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4qvh\" (UniqueName: \"kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh\") pod \"f26a71ec-b73f-472e-9f1a-2baae67c0691\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.324596 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities\") pod \"f26a71ec-b73f-472e-9f1a-2baae67c0691\" (UID: \"f26a71ec-b73f-472e-9f1a-2baae67c0691\") " Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.326277 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities" (OuterVolumeSpecName: "utilities") pod "f26a71ec-b73f-472e-9f1a-2baae67c0691" (UID: "f26a71ec-b73f-472e-9f1a-2baae67c0691"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.336347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh" (OuterVolumeSpecName: "kube-api-access-t4qvh") pod "f26a71ec-b73f-472e-9f1a-2baae67c0691" (UID: "f26a71ec-b73f-472e-9f1a-2baae67c0691"). InnerVolumeSpecName "kube-api-access-t4qvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.426686 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.426728 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4qvh\" (UniqueName: \"kubernetes.io/projected/f26a71ec-b73f-472e-9f1a-2baae67c0691-kube-api-access-t4qvh\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.464983 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f26a71ec-b73f-472e-9f1a-2baae67c0691" (UID: "f26a71ec-b73f-472e-9f1a-2baae67c0691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.527688 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f26a71ec-b73f-472e-9f1a-2baae67c0691-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.935059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10"} Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.938112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrdv6" event={"ID":"f26a71ec-b73f-472e-9f1a-2baae67c0691","Type":"ContainerDied","Data":"de73363c062da1cf5ea99d3a72beab78f1309c22ceddf3e8ade9a8a7e08a0659"} Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.938158 4885 scope.go:117] "RemoveContainer" containerID="9647a610d3a466e3140679f4c5ff0805df326a39296d3879bd2fa07fcee3188b" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.938235 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrdv6" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.965267 4885 scope.go:117] "RemoveContainer" containerID="4212a66d5d5f9213143c14cf7747f767d47ca182bda948cf7becb153b707f67f" Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.972871 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:09:18 crc kubenswrapper[4885]: I1205 20:09:18.976169 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rrdv6"] Dec 05 20:09:19 crc kubenswrapper[4885]: I1205 20:09:19.002796 4885 scope.go:117] "RemoveContainer" containerID="9b2fb0095c16a396610e3915982bdd5949190e7f9900efc280b37d152aba56b9" Dec 05 20:09:19 crc kubenswrapper[4885]: I1205 20:09:19.179676 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" path="/var/lib/kubelet/pods/f26a71ec-b73f-472e-9f1a-2baae67c0691/volumes" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.284244 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerName="oauth-openshift" containerID="cri-o://0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e" gracePeriod=15 Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.614619 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.721069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.721133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.721171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.721232 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2k6v\" (UniqueName: \"kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.721827 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722179 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722220 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722263 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722286 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: 
I1205 20:09:25.722444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection\") pod \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\" (UID: \"c6e3f1cc-5218-44b2-b4bf-168dae1629b7\") " Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722664 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.722807 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.723215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.723438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.724366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.728617 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.728795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v" (OuterVolumeSpecName: "kube-api-access-k2k6v") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "kube-api-access-k2k6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.729432 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.729924 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.730278 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.730899 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.735294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.735502 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.735884 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c6e3f1cc-5218-44b2-b4bf-168dae1629b7" (UID: "c6e3f1cc-5218-44b2-b4bf-168dae1629b7"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824775 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824830 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824847 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824863 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824879 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824894 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824911 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824929 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824945 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824961 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.824989 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.825005 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2k6v\" (UniqueName: 
\"kubernetes.io/projected/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-kube-api-access-k2k6v\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.825039 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6e3f1cc-5218-44b2-b4bf-168dae1629b7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.988807 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerID="0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e" exitCode=0 Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.988885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" event={"ID":"c6e3f1cc-5218-44b2-b4bf-168dae1629b7","Type":"ContainerDied","Data":"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e"} Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.988967 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" event={"ID":"c6e3f1cc-5218-44b2-b4bf-168dae1629b7","Type":"ContainerDied","Data":"173ae9461c0864f9e339bec53fa259f78761acf0c88a916751f2fbc8d628d20d"} Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.988899 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qcd9b" Dec 05 20:09:25 crc kubenswrapper[4885]: I1205 20:09:25.989012 4885 scope.go:117] "RemoveContainer" containerID="0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e" Dec 05 20:09:26 crc kubenswrapper[4885]: I1205 20:09:26.018616 4885 scope.go:117] "RemoveContainer" containerID="0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e" Dec 05 20:09:26 crc kubenswrapper[4885]: E1205 20:09:26.019278 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e\": container with ID starting with 0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e not found: ID does not exist" containerID="0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e" Dec 05 20:09:26 crc kubenswrapper[4885]: I1205 20:09:26.019387 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e"} err="failed to get container status \"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e\": rpc error: code = NotFound desc = could not find container \"0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e\": container with ID starting with 0d04c77a778a3eb8a085b9171fce3ee8783c664fe627eda97cdf2c2855cf1d4e not found: ID does not exist" Dec 05 20:09:26 crc kubenswrapper[4885]: I1205 20:09:26.037686 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"] Dec 05 20:09:26 crc kubenswrapper[4885]: I1205 20:09:26.040913 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qcd9b"] Dec 05 20:09:27 crc kubenswrapper[4885]: I1205 20:09:27.180777 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" 
path="/var/lib/kubelet/pods/c6e3f1cc-5218-44b2-b4bf-168dae1629b7/volumes" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029133 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5494594499-nfq79"] Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029374 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029388 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029405 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029413 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029425 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029433 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029443 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029451 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029461 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029469 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029480 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029490 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029514 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029537 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029552 4885 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerName="oauth-openshift" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029560 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerName="oauth-openshift" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029570 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029577 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029587 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029596 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="extract-utilities" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029604 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029611 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="extract-content" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.029621 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029629 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029748 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="42603535-a30f-41b2-96e3-10f3f8144003" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029763 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e3f1cc-5218-44b2-b4bf-168dae1629b7" containerName="oauth-openshift" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029772 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba098ab6-d9df-4d50-aaa6-085658e80871" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029783 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26a71ec-b73f-472e-9f1a-2baae67c0691" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.029794 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bca36c4-e503-4b3f-aaeb-829cebc24e4c" containerName="registry-server" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.030213 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.037687 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.037680 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.038229 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.038508 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.039676 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.040562 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.040690 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.040577 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.040775 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.041842 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.041964 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.042120 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.057361 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.060091 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5494594499-nfq79"] Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.066219 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.086871 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160509 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-login\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " 
pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-session\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160632 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-policies\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160734 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160796 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdvfm\" (UniqueName: \"kubernetes.io/projected/3d6a03bc-8326-43ba-8748-1a438eddde7d-kube-api-access-wdvfm\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.160889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-error\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-dir\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161306 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.161365 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262692 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-error\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262733 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-dir\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-login\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-session\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-dir\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.262922 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263101 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-policies\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvfm\" (UniqueName: \"kubernetes.io/projected/3d6a03bc-8326-43ba-8748-1a438eddde7d-kube-api-access-wdvfm\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263462 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.263464 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.264184 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-audit-policies\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.267289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.267770 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.268469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.268604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-session\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.269851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.271718 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.273354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.273505 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-login\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.275381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d6a03bc-8326-43ba-8748-1a438eddde7d-v4-0-config-user-template-error\") pod 
\"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.279520 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdvfm\" (UniqueName: \"kubernetes.io/projected/3d6a03bc-8326-43ba-8748-1a438eddde7d-kube-api-access-wdvfm\") pod \"oauth-openshift-5494594499-nfq79\" (UID: \"3d6a03bc-8326-43ba-8748-1a438eddde7d\") " pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.360517 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.540942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5494594499-nfq79"] Dec 05 20:09:28 crc kubenswrapper[4885]: W1205 20:09:28.547789 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6a03bc_8326_43ba_8748_1a438eddde7d.slice/crio-dc6a25ba4877d1c7853b678e9421f1cd6314c8fb19504e2717ae92a97e34cbbd WatchSource:0}: Error finding container dc6a25ba4877d1c7853b678e9421f1cd6314c8fb19504e2717ae92a97e34cbbd: Status 404 returned error can't find the container with id dc6a25ba4877d1c7853b678e9421f1cd6314c8fb19504e2717ae92a97e34cbbd Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.633436 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634180 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634271 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634539 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61" gracePeriod=15 Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634581 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630" gracePeriod=15 Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634618 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6" gracePeriod=15 Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634626 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84" gracePeriod=15 Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.634594 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe" gracePeriod=15 Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.635744 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636005 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636039 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636049 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636056 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636070 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636078 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636096 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636104 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636113 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636121 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636137 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636145 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.636152 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636160 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636293 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636304 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636316 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636326 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636337 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.636348 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.665289 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769642 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769642 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769813 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.769947 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: E1205 20:09:28.795781 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-5494594499-nfq79.187e6aa6d7e93e70 openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-5494594499-nfq79,UID:3d6a03bc-8326-43ba-8748-1a438eddde7d,APIVersion:v1,ResourceVersion:29302,FieldPath:spec.containers{oauth-openshift},},Reason:Created,Message:Created container oauth-openshift,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:09:28.795315824 +0000 UTC m=+234.092131485,LastTimestamp:2025-12-05 20:09:28.795315824 +0000 UTC m=+234.092131485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871397 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871565 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871586 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871610 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871814 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871891 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.871916 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:09:28 crc kubenswrapper[4885]: I1205 20:09:28.966141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:28 crc kubenswrapper[4885]: W1205 20:09:28.989326 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-91607db7865afe637b7aaa00327b349efc711519a75b5549041aee5c611c51bf WatchSource:0}: Error finding container 91607db7865afe637b7aaa00327b349efc711519a75b5549041aee5c611c51bf: Status 404 returned error can't find the container with id 91607db7865afe637b7aaa00327b349efc711519a75b5549041aee5c611c51bf Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.027170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerStarted","Data":"6b863cf3a08298889e2af79c627c281e6fc2acd33e67bd2862c19fa20feeb89c"} Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.027400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerStarted","Data":"dc6a25ba4877d1c7853b678e9421f1cd6314c8fb19504e2717ae92a97e34cbbd"} Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.028422 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.028505 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.028662 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.031955 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.034659 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.035451 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6" exitCode=0 Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.035483 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe" exitCode=0 Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.035496 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630" exitCode=0 Dec 05 20:09:29 crc 
kubenswrapper[4885]: I1205 20:09:29.035506 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84" exitCode=2 Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.035581 4885 scope.go:117] "RemoveContainer" containerID="e296309c85073fbaf40d360dc8ad9d597d65033ad6aa5e4250dbc542b9eb2982" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.037991 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa54f954-f05a-44b2-8f26-4a9990d44845" containerID="f1c299c0d56a529caf2c9ce34b6fff3d0defa222c28bca3afea01cb17fbd6dfa" exitCode=0 Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.038055 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"aa54f954-f05a-44b2-8f26-4a9990d44845","Type":"ContainerDied","Data":"f1c299c0d56a529caf2c9ce34b6fff3d0defa222c28bca3afea01cb17fbd6dfa"} Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.038729 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.038971 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.038989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"91607db7865afe637b7aaa00327b349efc711519a75b5549041aee5c611c51bf"} Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.039231 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.309844 4885 patch_prober.go:28] interesting pod/oauth-openshift-5494594499-nfq79 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:43768->10.217.0.56:6443: read: connection reset by peer" start-of-body= Dec 05 20:09:29 crc kubenswrapper[4885]: I1205 20:09:29.309897 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:43768->10.217.0.56:6443: read: connection reset by peer" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.050770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20"} Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.051545 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.051603 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.051963 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.052976 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/0.log" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.053081 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d6a03bc-8326-43ba-8748-1a438eddde7d" containerID="6b863cf3a08298889e2af79c627c281e6fc2acd33e67bd2862c19fa20feeb89c" exitCode=255 Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.053125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerDied","Data":"6b863cf3a08298889e2af79c627c281e6fc2acd33e67bd2862c19fa20feeb89c"} Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.053560 4885 scope.go:117] "RemoveContainer" containerID="6b863cf3a08298889e2af79c627c281e6fc2acd33e67bd2862c19fa20feeb89c" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.053729 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.054047 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.056162 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.199233 4885 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.200357 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.201057 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.201392 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.201674 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.201704 4885 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.201920 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.280919 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.281565 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.282125 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.391804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir\") pod \"aa54f954-f05a-44b2-8f26-4a9990d44845\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.391879 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock\") pod \"aa54f954-f05a-44b2-8f26-4a9990d44845\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.391944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access\") pod \"aa54f954-f05a-44b2-8f26-4a9990d44845\" (UID: \"aa54f954-f05a-44b2-8f26-4a9990d44845\") " Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.391946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aa54f954-f05a-44b2-8f26-4a9990d44845" (UID: "aa54f954-f05a-44b2-8f26-4a9990d44845"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.392053 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock" (OuterVolumeSpecName: "var-lock") pod "aa54f954-f05a-44b2-8f26-4a9990d44845" (UID: "aa54f954-f05a-44b2-8f26-4a9990d44845"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.392362 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.392381 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aa54f954-f05a-44b2-8f26-4a9990d44845-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.398298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aa54f954-f05a-44b2-8f26-4a9990d44845" (UID: "aa54f954-f05a-44b2-8f26-4a9990d44845"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.402539 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Dec 05 20:09:30 crc kubenswrapper[4885]: I1205 20:09:30.493850 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa54f954-f05a-44b2-8f26-4a9990d44845-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.698291 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:09:30Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:09:30Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:09:30Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:09:30Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.698991 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.699382 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.699891 4885 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.704610 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.704643 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:09:30 crc kubenswrapper[4885]: E1205 20:09:30.803135 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.037930 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.039236 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.040289 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.040832 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.041392 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.066346 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/1.log" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.067918 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/0.log" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.067992 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d6a03bc-8326-43ba-8748-1a438eddde7d" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8" exitCode=255 Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.068108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerDied","Data":"bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8"} Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.068326 4885 scope.go:117] "RemoveContainer" containerID="6b863cf3a08298889e2af79c627c281e6fc2acd33e67bd2862c19fa20feeb89c" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.068847 4885 scope.go:117] "RemoveContainer" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.068885 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.069300 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.069366 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.069974 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.072902 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.074264 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61" exitCode=0 Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.074533 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.077054 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.079120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"aa54f954-f05a-44b2-8f26-4a9990d44845","Type":"ContainerDied","Data":"816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387"} Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.079167 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="816d51aa00b2bc301886f30342114e72827a0b98b240d46a66d1c4887aade387" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.079225 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105275 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105324 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105356 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105370 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105647 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105663 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105674 4885 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.105671 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.106349 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.106613 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.118436 4885 scope.go:117] "RemoveContainer" containerID="247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.148305 4885 scope.go:117] "RemoveContainer" containerID="d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.168089 4885 scope.go:117] "RemoveContainer" containerID="2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.192115 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.202410 4885 scope.go:117] "RemoveContainer" containerID="e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.220831 4885 scope.go:117] "RemoveContainer" containerID="c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.240968 4885 scope.go:117] "RemoveContainer" containerID="bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.267706 4885 scope.go:117] "RemoveContainer" containerID="247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 
20:09:31.268315 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\": container with ID starting with 247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6 not found: ID does not exist" containerID="247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.268457 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6"} err="failed to get container status \"247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\": rpc error: code = NotFound desc = could not find container \"247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6\": container with ID starting with 247620dd3715e77d19adca02f41ae346b3ac916aa4771210c893de4f79dbd3e6 not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.268564 4885 scope.go:117] "RemoveContainer" containerID="d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.269318 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\": container with ID starting with d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe not found: ID does not exist" containerID="d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.269376 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe"} err="failed to get container status \"d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\": rpc error: code = NotFound desc = could not find container \"d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe\": container with ID starting with d99be042480732efdc902db0d647aed2dd99fb55e0af1274b3d51e04eddbdbbe not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.269410 4885 scope.go:117] "RemoveContainer" containerID="2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.269947 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\": container with ID starting with 2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630 not found: ID does not exist" containerID="2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.269997 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630"} err="failed to get container status \"2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\": rpc error: code = NotFound desc = could not find container \"2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630\": container with ID starting with 2feb5b84bd5d2403f56823d86b79b37b9688de967d2e1b6975628ba754db5630 not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.270055 4885 
scope.go:117] "RemoveContainer" containerID="e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.270442 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\": container with ID starting with e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84 not found: ID does not exist" containerID="e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.270601 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84"} err="failed to get container status \"e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\": rpc error: code = NotFound desc = could not find container \"e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84\": container with ID starting with e264989df2babf72c42071e8ade3f691172dd9f4e6242e2a616cb296d8dfdf84 not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.270788 4885 scope.go:117] "RemoveContainer" containerID="c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.271295 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\": container with ID starting with c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61 not found: ID does not exist" containerID="c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.271335 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61"} err="failed to get container status \"c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\": rpc error: code = NotFound desc = could not find container \"c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61\": container with ID starting with c073434aa8bf3ec8f1edf79f38a3d25b691805003261eabcaffd236a29942d61 not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.271362 4885 scope.go:117] "RemoveContainer" containerID="bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.271746 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\": container with ID starting with bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d not found: ID does not exist" containerID="bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.271913 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d"} err="failed to get container status \"bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\": rpc error: code = NotFound desc = could not find container \"bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d\": container with ID starting with 
bb14849f79cf0e36695d774e4bfd568ba745392ab0f4667e5244e4181f70941d not found: ID does not exist" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.381216 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.381892 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: I1205 20:09:31.382285 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:31 crc kubenswrapper[4885]: E1205 20:09:31.604447 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Dec 05 20:09:32 crc kubenswrapper[4885]: I1205 20:09:32.083253 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/1.log" Dec 05 20:09:32 crc kubenswrapper[4885]: I1205 20:09:32.083974 4885 scope.go:117] "RemoveContainer" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8" Dec 05 20:09:32 crc kubenswrapper[4885]: I1205 20:09:32.084012 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:32 crc kubenswrapper[4885]: E1205 20:09:32.084202 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" Dec 05 20:09:32 crc kubenswrapper[4885]: I1205 20:09:32.084721 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:32 crc kubenswrapper[4885]: I1205 20:09:32.084941 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:33 crc kubenswrapper[4885]: E1205 20:09:33.173851 4885 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" volumeName="registry-storage" Dec 05 20:09:33 crc kubenswrapper[4885]: E1205 20:09:33.206331 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Dec 05 20:09:35 crc kubenswrapper[4885]: I1205 20:09:35.177544 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:35 crc kubenswrapper[4885]: I1205 20:09:35.178008 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:36 crc kubenswrapper[4885]: E1205 20:09:36.407483 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="6.4s" Dec 05 20:09:36 crc kubenswrapper[4885]: E1205 20:09:36.561281 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-5494594499-nfq79.187e6aa6d7e93e70 openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-5494594499-nfq79,UID:3d6a03bc-8326-43ba-8748-1a438eddde7d,APIVersion:v1,ResourceVersion:29302,FieldPath:spec.containers{oauth-openshift},},Reason:Created,Message:Created container oauth-openshift,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:09:28.795315824 +0000 UTC m=+234.092131485,LastTimestamp:2025-12-05 20:09:28.795315824 +0000 UTC m=+234.092131485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:09:38 crc kubenswrapper[4885]: I1205 20:09:38.361640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:38 crc kubenswrapper[4885]: I1205 20:09:38.363235 4885 scope.go:117] "RemoveContainer" 
containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8" Dec 05 20:09:38 crc kubenswrapper[4885]: E1205 20:09:38.363465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" Dec 05 20:09:38 crc kubenswrapper[4885]: I1205 20:09:38.363553 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" Dec 05 20:09:39 crc kubenswrapper[4885]: I1205 20:09:39.128248 4885 scope.go:117] "RemoveContainer" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8" Dec 05 20:09:39 crc kubenswrapper[4885]: E1205 20:09:39.129109 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.149978 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.150214 4885 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348" exitCode=1 Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.150268 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348"} Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.150967 4885 scope.go:117] "RemoveContainer" containerID="76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348" Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.151523 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.152030 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:42 crc kubenswrapper[4885]: I1205 20:09:42.152493 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:42 crc kubenswrapper[4885]: E1205 20:09:42.809510 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Dec 05 20:09:43 crc kubenswrapper[4885]: I1205 20:09:43.162776 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:09:43 crc kubenswrapper[4885]: I1205 20:09:43.163934 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"81544f0d363a8d8395e553f517fd04611bd8d0c00f523a8d055c402f9b2d1cb2"} Dec 05 20:09:43 crc kubenswrapper[4885]: I1205 20:09:43.165474 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:43 crc kubenswrapper[4885]: I1205 20:09:43.166060 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:43 crc kubenswrapper[4885]: I1205 20:09:43.166542 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.172454 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.175276 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.176092 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.176786 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.197067 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.197106 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e" Dec 05 20:09:44 crc kubenswrapper[4885]: E1205 20:09:44.198059 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.198850 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:44 crc kubenswrapper[4885]: I1205 20:09:44.508558 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.181404 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.183061 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.183703 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.184522 4885 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.188929 4885 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2e0e6526a6e5b50b2083465b5da3761edf122d39a90573849461b530d0b9a448" exitCode=0
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.188976 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2e0e6526a6e5b50b2083465b5da3761edf122d39a90573849461b530d0b9a448"}
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.189109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11458e491023870d7711445884e9ae405804072df60f9280324cb05bea909fdc"}
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.189499 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.189536 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:45 crc kubenswrapper[4885]: E1205 20:09:45.190013 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.190016 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.191354 4885 status_manager.go:851] "Failed to get status for pod" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.192593 4885 status_manager.go:851] "Failed to get status for pod" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5494594499-nfq79\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:45 crc kubenswrapper[4885]: I1205 20:09:45.193962 4885 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused"
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.206499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22a58177264bddba1e54b4aaaa78828b8622b48f78ebfc4d655c9c9d2a9cb037"}
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.206766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ceb93221bf4ecc9606758fd3d0c485308eee517f0cc51a282977f575c7998d88"}
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.206782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1fa4ba4080e023b60151655af70d6e492d08ec05e054d8f24b863940d1f82b41"}
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.206797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6531dbabe2f6bfe1d4b7ae93e0e2c71cfad512d426c994c912b99d2a61dbd5e4"}
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.326012 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.326366 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 20:09:46 crc kubenswrapper[4885]: I1205 20:09:46.326411 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 20:09:47 crc kubenswrapper[4885]: I1205 20:09:47.217000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8b20b0a98c8339635891fcdddfe7482b731a666f57065afff05521068f484c70"}
Dec 05 20:09:47 crc kubenswrapper[4885]: I1205 20:09:47.217533 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:47 crc kubenswrapper[4885]: I1205 20:09:47.217538 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:47 crc kubenswrapper[4885]: I1205 20:09:47.217579 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:49 crc kubenswrapper[4885]: I1205 20:09:49.199420 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:49 crc kubenswrapper[4885]: I1205 20:09:49.199825 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:49 crc kubenswrapper[4885]: I1205 20:09:49.208180 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:52 crc kubenswrapper[4885]: I1205 20:09:52.230941 4885 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:52 crc kubenswrapper[4885]: I1205 20:09:52.276316 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9bf69ce-c983-4b0f-8ef8-b5681f761f58"
Dec 05 20:09:53 crc kubenswrapper[4885]: I1205 20:09:53.249168 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:53 crc kubenswrapper[4885]: I1205 20:09:53.249199 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:53 crc kubenswrapper[4885]: I1205 20:09:53.254238 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9bf69ce-c983-4b0f-8ef8-b5681f761f58"
Dec 05 20:09:53 crc kubenswrapper[4885]: I1205 20:09:53.258591 4885 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://6531dbabe2f6bfe1d4b7ae93e0e2c71cfad512d426c994c912b99d2a61dbd5e4"
Dec 05 20:09:53 crc kubenswrapper[4885]: I1205 20:09:53.258652 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:09:54 crc kubenswrapper[4885]: I1205 20:09:54.173094 4885 scope.go:117] "RemoveContainer" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8"
Dec 05 20:09:54 crc kubenswrapper[4885]: I1205 20:09:54.257558 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:54 crc kubenswrapper[4885]: I1205 20:09:54.257585 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:09:54 crc kubenswrapper[4885]: I1205 20:09:54.260699 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9bf69ce-c983-4b0f-8ef8-b5681f761f58"
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.267427 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/2.log"
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.268956 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/1.log"
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.269043 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d6a03bc-8326-43ba-8748-1a438eddde7d" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c" exitCode=255
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.269085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerDied","Data":"fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"}
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.269137 4885 scope.go:117] "RemoveContainer" containerID="bf667db8ba60f5f0fdb596eb7fbe184be0f3a0506adb4cc9eed4c54fcdba9fa8"
Dec 05 20:09:55 crc kubenswrapper[4885]: I1205 20:09:55.269762 4885 scope.go:117] "RemoveContainer" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"
Dec 05 20:09:55 crc kubenswrapper[4885]: E1205 20:09:55.270120 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d"
Dec 05 20:09:56 crc kubenswrapper[4885]: I1205 20:09:56.279767 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/2.log"
Dec 05 20:09:56 crc kubenswrapper[4885]: I1205 20:09:56.325607 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 20:09:56 crc kubenswrapper[4885]: I1205 20:09:56.325676 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 20:09:58 crc kubenswrapper[4885]: I1205 20:09:58.361707 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5494594499-nfq79"
Dec 05 20:09:58 crc kubenswrapper[4885]: I1205 20:09:58.361838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5494594499-nfq79"
Dec 05 20:09:58 crc kubenswrapper[4885]: I1205 20:09:58.362486 4885 scope.go:117] "RemoveContainer" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"
Dec 05 20:09:58 crc kubenswrapper[4885]: E1205 20:09:58.362788 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d"
Dec 05 20:09:59 crc kubenswrapper[4885]: I1205 20:09:59.297181 4885 scope.go:117] "RemoveContainer" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"
Dec 05 20:09:59 crc kubenswrapper[4885]: E1205 20:09:59.297841 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d"
Dec 05 20:10:01 crc kubenswrapper[4885]: I1205 20:10:01.382617 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 05 20:10:01 crc kubenswrapper[4885]: I1205 20:10:01.454103 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.115778 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.642563 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.790750 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.798432 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.961198 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.968387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 20:10:02 crc kubenswrapper[4885]: I1205 20:10:02.971925 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.232743 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.283643 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.364548 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.443643 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.473333 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.600149 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.674626 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.856515 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 05 20:10:03 crc kubenswrapper[4885]: I1205 20:10:03.933055 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.088719 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.199498 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.276895 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.391532 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.469865 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.553175 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.563956 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.730146 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.755070 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.815347 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.819534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.851224 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.864895 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.906428 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.931430 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 05 20:10:04 crc kubenswrapper[4885]: I1205 20:10:04.986948 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.083961 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.140215 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.200388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.383181 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.579909 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.653441 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.657417 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.722232 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.860099 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.900381 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.905831 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.919128 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 05 20:10:05 crc kubenswrapper[4885]: I1205 20:10:05.989967 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.015571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.088226 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.104301 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.326081 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.326135 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.326191 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.326288 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.327412 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"81544f0d363a8d8395e553f517fd04611bd8d0c00f523a8d055c402f9b2d1cb2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.327684 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://81544f0d363a8d8395e553f517fd04611bd8d0c00f523a8d055c402f9b2d1cb2" gracePeriod=30
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.445788 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.532287 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.554599 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.556483 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.573218 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.613112 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.635704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.847827 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.862600 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.892217 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.933727 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.938705 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.989658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 05 20:10:06 crc kubenswrapper[4885]: I1205 20:10:06.997835 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.036987 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.100885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.101497 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.377168 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.409741 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.459448 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.540378 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.542679 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.759448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.759570 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.779342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.814418 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.887104 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.964753 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 05 20:10:07 crc kubenswrapper[4885]: I1205 20:10:07.991193 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.040315 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.084689 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.096865 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.115245 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.139974 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.162409 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.258956 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.335881 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.341648 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.383488 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.385663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.413822 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.423064 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.434273 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.455516 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.571319 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.663443 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.712683 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.713819 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.809706 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.811473 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.823637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.869734 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.912625 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 20:10:08 crc kubenswrapper[4885]: I1205 20:10:08.962556 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.024243 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.050498 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.075727 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.076483 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.147746 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.186254 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.188781 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.288543 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.302073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.344869 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.419385 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.435273 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.543469 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.609097 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.678711 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.692341 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.706763 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.743282 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.745115 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.767606 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.780316 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.781741 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.808499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.837551 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.929144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.935126 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.996331 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 05 20:10:09 crc kubenswrapper[4885]: I1205 20:10:09.997359 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.005237 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.068164 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.079196 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.115488 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.420168 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.473487 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.627839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.677545 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.734090 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.840010 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.895571 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 05 20:10:10 crc kubenswrapper[4885]: I1205 20:10:10.901349 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.064317 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.075889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.161065 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.180167 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.196262 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.285099 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.341940 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.378039 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.412716 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.443895 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.465679 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.573670 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.701950 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.734590 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.755788 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.772143 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.794664 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.797262 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 05 20:10:11 crc kubenswrapper[4885]: I1205 20:10:11.941924 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.012000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.016675 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.024346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.101348 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.117760 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.214966 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.316108 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.396253 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.472455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.557225 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.713092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.853710 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.868948 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.942464 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:10:12 crc kubenswrapper[4885]: I1205 20:10:12.982884 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.012092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.109488 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.225748 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.233687 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.252846 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.289688 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.290444 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.336757 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.337869 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.352221 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.358955 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.448396 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.536662 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.561862 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.574853 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.577147 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.584739 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.631682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.728170 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.761358 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.860612 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.958210 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 05 20:10:13 crc kubenswrapper[4885]: I1205 20:10:13.994976 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.027450 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.040535 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.147002 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.174664 4885 scope.go:117] "RemoveContainer" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"
Dec 05 20:10:14 crc kubenswrapper[4885]: E1205 20:10:14.175049 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5494594499-nfq79_openshift-authentication(3d6a03bc-8326-43ba-8748-1a438eddde7d)\"" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.187751 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.356667 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.389072 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.397887 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.738796 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 05 20:10:14 crc kubenswrapper[4885]: I1205 20:10:14.816534 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.186613 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.242576 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.273229 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.407927 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.408977 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.518605 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.531191 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.577318 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.638180 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.645426 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.706781 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.830286 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.888808 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 05 20:10:15 crc kubenswrapper[4885]: I1205 20:10:15.925684 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.028291 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.136100 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.157408 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.278609 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.388146 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.396567 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.396656 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.397275 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.397327 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e003c8d-46a7-4194-b63b-100b1d5af08e"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.405231 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.429751 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.429725168 podStartE2EDuration="24.429725168s" podCreationTimestamp="2025-12-05 20:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:10:16.419371154 +0000 UTC m=+281.716186845" watchObservedRunningTime="2025-12-05 20:10:16.429725168 +0000 UTC m=+281.726540859"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.512359 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.693435 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.823326 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 05 20:10:16 crc kubenswrapper[4885]: I1205 20:10:16.941938 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 20:10:17 crc kubenswrapper[4885]: I1205 20:10:17.409052 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 05 20:10:25 crc kubenswrapper[4885]: I1205 20:10:25.737156 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 05 20:10:25 crc kubenswrapper[4885]: I1205 20:10:25.738412 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20" gracePeriod=5
Dec 05 20:10:27 crc kubenswrapper[4885]: I1205 20:10:27.026346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 05 20:10:27 crc kubenswrapper[4885]: I1205 20:10:27.699915 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.173797 4885 scope.go:117] "RemoveContainer" containerID="fcf75bbdaed56f54b635f11072f5f368909797bcb19b743b62b783b7d7e3d17c"
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.539714 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5494594499-nfq79_3d6a03bc-8326-43ba-8748-1a438eddde7d/oauth-openshift/2.log"
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.540264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" event={"ID":"3d6a03bc-8326-43ba-8748-1a438eddde7d","Type":"ContainerStarted","Data":"db2821734937401227b699b892329b651c889f5f2c54b19dbae3aa09e6ab2e45"}
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.541501 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5494594499-nfq79"
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.542610 4885 patch_prober.go:28] interesting pod/oauth-openshift-5494594499-nfq79 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body=
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.542674 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podUID="3d6a03bc-8326-43ba-8748-1a438eddde7d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused"
Dec 05 20:10:29 crc kubenswrapper[4885]: I1205 20:10:29.559637 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5494594499-nfq79" podStartSLOduration=89.559602466 podStartE2EDuration="1m29.559602466s" podCreationTimestamp="2025-12-05 20:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:10:29.558568933 +0000 UTC m=+294.855384594" watchObservedRunningTime="2025-12-05 20:10:29.559602466 +0000 UTC m=+294.856418127"
Dec 05 20:10:30 crc kubenswrapper[4885]: I1205 20:10:30.551687 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5494594499-nfq79"
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.340899 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.341009 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438719 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438802 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438845 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438847 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.438955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.439446 4885 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.439479 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.439539 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.439636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.451913 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.541126 4885 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.541161 4885 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.541173 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.542942 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.553298 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.553384 4885 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20" exitCode=137 Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.553497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.553542 4885 scope.go:117] "RemoveContainer" containerID="025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.585735 4885 scope.go:117] "RemoveContainer" containerID="025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20" Dec 05 20:10:31 crc kubenswrapper[4885]: E1205 20:10:31.586318 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20\": container with ID starting with 025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20 not found: ID does not exist" containerID="025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20" Dec 05 20:10:31 crc kubenswrapper[4885]: I1205 20:10:31.586370 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20"} err="failed to get container status \"025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20\": rpc error: code = NotFound desc = could not find container \"025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20\": container with ID starting with 025a4d3862b2df814c65a70f11b813bdeb5369fe2b69f86e339a07f8bdd1ab20 not found: ID does not exist" Dec 05 20:10:33 crc kubenswrapper[4885]: I1205 20:10:33.183545 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 20:10:36 crc kubenswrapper[4885]: I1205 20:10:36.588594 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:10:36 crc kubenswrapper[4885]: I1205 20:10:36.590540 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:10:36 crc kubenswrapper[4885]: I1205 20:10:36.590597 4885 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="81544f0d363a8d8395e553f517fd04611bd8d0c00f523a8d055c402f9b2d1cb2" exitCode=137 Dec 05 20:10:36 crc kubenswrapper[4885]: I1205 20:10:36.590632 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"81544f0d363a8d8395e553f517fd04611bd8d0c00f523a8d055c402f9b2d1cb2"} Dec 05 20:10:36 crc kubenswrapper[4885]: I1205 20:10:36.590667 4885 scope.go:117] "RemoveContainer" containerID="76928a3ac39af8c003c960c2f1f330945acf4d4880f1530c4a41d0af5ff8c348" Dec 05 20:10:37 crc kubenswrapper[4885]: I1205 20:10:37.598607 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:10:37 crc kubenswrapper[4885]: I1205 20:10:37.600166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5b8f20dba2b53fb695acf35988bb2e37cdf6b1861816316c60319f7e4e7b7243"} Dec 05 20:10:38 crc kubenswrapper[4885]: I1205 20:10:38.556430 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:10:38 crc kubenswrapper[4885]: I1205 20:10:38.924338 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:10:39 crc kubenswrapper[4885]: I1205 20:10:39.613096 4885 generic.go:334] "Generic (PLEG): container finished" podID="106ffd61-239f-4707-b999-aa044f6f30ae" containerID="8e1392383c19bfc5439cf6a03b16f4e7128a7e48f79ec146434f29359e401e0f" exitCode=0 Dec 05 20:10:39 crc kubenswrapper[4885]: I1205 20:10:39.613140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerDied","Data":"8e1392383c19bfc5439cf6a03b16f4e7128a7e48f79ec146434f29359e401e0f"} Dec 05 20:10:39 crc kubenswrapper[4885]: I1205 20:10:39.613565 4885 scope.go:117] "RemoveContainer" containerID="8e1392383c19bfc5439cf6a03b16f4e7128a7e48f79ec146434f29359e401e0f" Dec 05 20:10:40 crc kubenswrapper[4885]: I1205 20:10:40.119114 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:10:40 crc kubenswrapper[4885]: I1205 20:10:40.618915 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerStarted","Data":"efadd20e9f956c6cca8f25167f14ab63bfcd7f50f8c4d5b6d6a10248e0b3f634"} Dec 05 20:10:40 crc kubenswrapper[4885]: I1205 20:10:40.619343 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:10:40 crc kubenswrapper[4885]: I1205 20:10:40.621132 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:10:41 crc kubenswrapper[4885]: I1205 20:10:41.409939 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:10:43 crc kubenswrapper[4885]: I1205 20:10:43.056465 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:10:44 crc kubenswrapper[4885]: I1205 20:10:44.508595 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:45 crc kubenswrapper[4885]: I1205 20:10:45.019299 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:10:46 crc kubenswrapper[4885]: I1205 20:10:46.325642 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:46 crc kubenswrapper[4885]: I1205 20:10:46.330160 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:47 crc kubenswrapper[4885]: I1205 20:10:47.429877 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 20:10:48 crc kubenswrapper[4885]: I1205 20:10:48.300472 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:10:48 crc kubenswrapper[4885]: I1205 20:10:48.586675 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:10:49 crc kubenswrapper[4885]: I1205 20:10:49.099183 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:10:49 crc kubenswrapper[4885]: I1205 20:10:49.602839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:10:54 crc kubenswrapper[4885]: I1205 20:10:54.123323 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:10:54 crc kubenswrapper[4885]: I1205 20:10:54.515090 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:55 crc kubenswrapper[4885]: I1205 20:10:55.082505 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:10:56 crc kubenswrapper[4885]: I1205 20:10:56.491129 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:11:12 crc kubenswrapper[4885]: I1205 20:11:12.759865 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:11:12 crc kubenswrapper[4885]: I1205 20:11:12.760506 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerName="route-controller-manager" containerID="cri-o://0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913" gracePeriod=30 Dec 05 20:11:12 crc kubenswrapper[4885]: I1205 20:11:12.774977 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:11:12 crc kubenswrapper[4885]: I1205 20:11:12.775522 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" podUID="c9e46b72-f528-4f07-8b1e-96b98302ac86" containerName="controller-manager" containerID="cri-o://df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9" gracePeriod=30 Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.120979 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.127080 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265797 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert\") pod \"34f5add1-5763-4b13-8058-e1b6fbbb4740\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles\") pod \"c9e46b72-f528-4f07-8b1e-96b98302ac86\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265886 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config\") pod \"c9e46b72-f528-4f07-8b1e-96b98302ac86\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67bs2\" (UniqueName: \"kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2\") pod \"34f5add1-5763-4b13-8058-e1b6fbbb4740\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config\") pod \"34f5add1-5763-4b13-8058-e1b6fbbb4740\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265971 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca\") pod \"c9e46b72-f528-4f07-8b1e-96b98302ac86\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.265990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca\") pod \"34f5add1-5763-4b13-8058-e1b6fbbb4740\" (UID: \"34f5add1-5763-4b13-8058-e1b6fbbb4740\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.266011 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hclpm\" (UniqueName: \"kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm\") pod \"c9e46b72-f528-4f07-8b1e-96b98302ac86\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.266070 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert\") pod \"c9e46b72-f528-4f07-8b1e-96b98302ac86\" (UID: \"c9e46b72-f528-4f07-8b1e-96b98302ac86\") " Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.267310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca" (OuterVolumeSpecName: "client-ca") pod "34f5add1-5763-4b13-8058-e1b6fbbb4740" (UID: "34f5add1-5763-4b13-8058-e1b6fbbb4740"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.267690 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config" (OuterVolumeSpecName: "config") pod "c9e46b72-f528-4f07-8b1e-96b98302ac86" (UID: "c9e46b72-f528-4f07-8b1e-96b98302ac86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.268363 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c9e46b72-f528-4f07-8b1e-96b98302ac86" (UID: "c9e46b72-f528-4f07-8b1e-96b98302ac86"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.268377 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9e46b72-f528-4f07-8b1e-96b98302ac86" (UID: "c9e46b72-f528-4f07-8b1e-96b98302ac86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.268638 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config" (OuterVolumeSpecName: "config") pod "34f5add1-5763-4b13-8058-e1b6fbbb4740" (UID: "34f5add1-5763-4b13-8058-e1b6fbbb4740"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.273235 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9e46b72-f528-4f07-8b1e-96b98302ac86" (UID: "c9e46b72-f528-4f07-8b1e-96b98302ac86"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.273256 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm" (OuterVolumeSpecName: "kube-api-access-hclpm") pod "c9e46b72-f528-4f07-8b1e-96b98302ac86" (UID: "c9e46b72-f528-4f07-8b1e-96b98302ac86"). InnerVolumeSpecName "kube-api-access-hclpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.273289 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2" (OuterVolumeSpecName: "kube-api-access-67bs2") pod "34f5add1-5763-4b13-8058-e1b6fbbb4740" (UID: "34f5add1-5763-4b13-8058-e1b6fbbb4740"). InnerVolumeSpecName "kube-api-access-67bs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.273656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34f5add1-5763-4b13-8058-e1b6fbbb4740" (UID: "34f5add1-5763-4b13-8058-e1b6fbbb4740"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367760 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67bs2\" (UniqueName: \"kubernetes.io/projected/34f5add1-5763-4b13-8058-e1b6fbbb4740-kube-api-access-67bs2\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367803 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367823 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367842 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f5add1-5763-4b13-8058-e1b6fbbb4740-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367859 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hclpm\" (UniqueName: \"kubernetes.io/projected/c9e46b72-f528-4f07-8b1e-96b98302ac86-kube-api-access-hclpm\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367875 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e46b72-f528-4f07-8b1e-96b98302ac86-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367892 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f5add1-5763-4b13-8058-e1b6fbbb4740-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.367909 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc 
kubenswrapper[4885]: I1205 20:11:13.367924 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e46b72-f528-4f07-8b1e-96b98302ac86-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.822944 4885 generic.go:334] "Generic (PLEG): container finished" podID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerID="0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913" exitCode=0 Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.823071 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" event={"ID":"34f5add1-5763-4b13-8058-e1b6fbbb4740","Type":"ContainerDied","Data":"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913"} Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.823083 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.823129 4885 scope.go:117] "RemoveContainer" containerID="0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.823112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659" event={"ID":"34f5add1-5763-4b13-8058-e1b6fbbb4740","Type":"ContainerDied","Data":"d61f4330cc806d9407145dfa60d08efd1de135fea6f12049312ddcf3250e0675"} Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.825769 4885 generic.go:334] "Generic (PLEG): container finished" podID="c9e46b72-f528-4f07-8b1e-96b98302ac86" containerID="df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9" exitCode=0 Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.825806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" event={"ID":"c9e46b72-f528-4f07-8b1e-96b98302ac86","Type":"ContainerDied","Data":"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9"} Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.825835 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" event={"ID":"c9e46b72-f528-4f07-8b1e-96b98302ac86","Type":"ContainerDied","Data":"56375deacdad25513422b9c51bab324fb3d7e97f288385e871f5a764e6acdc4e"} Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.825879 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xqrhh" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.849330 4885 scope.go:117] "RemoveContainer" containerID="0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913" Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.856948 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913\": container with ID starting with 0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913 not found: ID does not exist" containerID="0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.857096 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913"} err="failed to get container status \"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913\": rpc error: code = NotFound desc = could not find container \"0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913\": container with ID starting with 0a928fc1fc3a9789077f28ecd7ad4138d9276c48b718c5026a9df005e4a98913 not found: ID does not exist" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.857161 4885 scope.go:117] "RemoveContainer" containerID="df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.898257 4885 scope.go:117] "RemoveContainer" containerID="df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9" Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.901147 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9\": container with ID starting with df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9 not found: ID does not exist" containerID="df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.901199 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9"} err="failed to get container status \"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9\": rpc error: code = NotFound desc = could not find container \"df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9\": container with ID starting with df42832dde0166224b318fc872ebc3d57ae7022ed6e6e4b4f34e7ccc3f0dfab9 not found: ID does not exist" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902138 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.902417 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" containerName="installer" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902439 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" containerName="installer" Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.902453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e46b72-f528-4f07-8b1e-96b98302ac86" containerName="controller-manager" Dec 05 20:11:13 crc 
kubenswrapper[4885]: I1205 20:11:13.902463 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e46b72-f528-4f07-8b1e-96b98302ac86" containerName="controller-manager" Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.902476 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerName="route-controller-manager" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902493 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerName="route-controller-manager" Dec 05 20:11:13 crc kubenswrapper[4885]: E1205 20:11:13.902530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902543 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902744 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902770 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e46b72-f528-4f07-8b1e-96b98302ac86" containerName="controller-manager" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902784 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" containerName="route-controller-manager" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.902798 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa54f954-f05a-44b2-8f26-4a9990d44845" containerName="installer" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.903315 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.909303 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.909734 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.910358 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.911106 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.911270 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.919514 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.920211 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.928069 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xqrhh"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.928381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.932369 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.942840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.943600 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.948343 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.948767 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.949009 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.949289 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.949519 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.949774 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.963215 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.973933 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:11:13 crc kubenswrapper[4885]: I1205 20:11:13.987607 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sp659"] Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077746 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077804 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2rl\" (UniqueName: \"kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq569\" (UniqueName: \"kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: 
\"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077934 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077953 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.077984 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.078002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.078059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.178855 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.178925 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.178953 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2rl\" (UniqueName: \"kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " 
pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.178983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq569\" (UniqueName: \"kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.179057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.179090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.179117 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.179134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.179169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.180408 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.181056 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.181063 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.181731 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.181999 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.183089 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.185010 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.196736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq569\" (UniqueName: \"kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569\") pod \"route-controller-manager-6495bf885b-p69qq\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.213006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2rl\" (UniqueName: \"kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl\") pod \"controller-manager-58975fd7b8-f2hfk\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.243832 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.264659 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.475856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.515729 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:14 crc kubenswrapper[4885]: W1205 20:11:14.517397 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19a512d_e436_45b4_9701_0121028e4826.slice/crio-cd51063b9287659f49b46d9387deda2e4a842a38aeeefd41c07b53ac04c09bd2 WatchSource:0}: Error finding container cd51063b9287659f49b46d9387deda2e4a842a38aeeefd41c07b53ac04c09bd2: Status 404 returned error can't find the container with id cd51063b9287659f49b46d9387deda2e4a842a38aeeefd41c07b53ac04c09bd2 Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.832585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" event={"ID":"e19a512d-e436-45b4-9701-0121028e4826","Type":"ContainerStarted","Data":"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7"} Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.832907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" event={"ID":"e19a512d-e436-45b4-9701-0121028e4826","Type":"ContainerStarted","Data":"cd51063b9287659f49b46d9387deda2e4a842a38aeeefd41c07b53ac04c09bd2"} Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.832923 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.834805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" event={"ID":"c526c9db-5b46-4c19-be12-1b59f7f622dc","Type":"ContainerStarted","Data":"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d"} Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.834851 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" event={"ID":"c526c9db-5b46-4c19-be12-1b59f7f622dc","Type":"ContainerStarted","Data":"7f45fa8dfef908f246059747d32e0f5cd26ab6e3a4ea4beb783f4e11bbf71bdc"} Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.835011 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.842937 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:14 crc kubenswrapper[4885]: I1205 20:11:14.848642 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" podStartSLOduration=1.848625807 podStartE2EDuration="1.848625807s" podCreationTimestamp="2025-12-05 20:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:14.847292978 +0000 
UTC m=+340.144108649" watchObservedRunningTime="2025-12-05 20:11:14.848625807 +0000 UTC m=+340.145441468" Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.056485 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.079720 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" podStartSLOduration=2.079695424 podStartE2EDuration="2.079695424s" podCreationTimestamp="2025-12-05 20:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:14.864894702 +0000 UTC m=+340.161710373" watchObservedRunningTime="2025-12-05 20:11:15.079695424 +0000 UTC m=+340.376511105" Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.179411 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f5add1-5763-4b13-8058-e1b6fbbb4740" path="/var/lib/kubelet/pods/34f5add1-5763-4b13-8058-e1b6fbbb4740/volumes" Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.180248 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e46b72-f528-4f07-8b1e-96b98302ac86" path="/var/lib/kubelet/pods/c9e46b72-f528-4f07-8b1e-96b98302ac86/volumes" Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.606385 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:15 crc kubenswrapper[4885]: I1205 20:11:15.622634 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:16 crc kubenswrapper[4885]: I1205 20:11:16.844063 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" podUID="e19a512d-e436-45b4-9701-0121028e4826" containerName="route-controller-manager" containerID="cri-o://f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7" gracePeriod=30 Dec 05 20:11:16 crc kubenswrapper[4885]: I1205 20:11:16.844257 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" podUID="c526c9db-5b46-4c19-be12-1b59f7f622dc" containerName="controller-manager" containerID="cri-o://b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d" gracePeriod=30 Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.265775 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.272682 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.291936 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"] Dec 05 20:11:17 crc kubenswrapper[4885]: E1205 20:11:17.293569 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19a512d-e436-45b4-9701-0121028e4826" containerName="route-controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.293596 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19a512d-e436-45b4-9701-0121028e4826" containerName="route-controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: E1205 20:11:17.293610 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c526c9db-5b46-4c19-be12-1b59f7f622dc" containerName="controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.293617 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c526c9db-5b46-4c19-be12-1b59f7f622dc" containerName="controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.293730 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c526c9db-5b46-4c19-be12-1b59f7f622dc" containerName="controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.293743 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19a512d-e436-45b4-9701-0121028e4826" containerName="route-controller-manager" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.294200 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.312010 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"] Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425453 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config\") pod \"c526c9db-5b46-4c19-be12-1b59f7f622dc\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425528 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert\") pod \"e19a512d-e436-45b4-9701-0121028e4826\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425554 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq569\" (UniqueName: \"kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569\") pod \"e19a512d-e436-45b4-9701-0121028e4826\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425572 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles\") pod \"c526c9db-5b46-4c19-be12-1b59f7f622dc\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425611 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert\") pod \"c526c9db-5b46-4c19-be12-1b59f7f622dc\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.425631 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config\") pod \"e19a512d-e436-45b4-9701-0121028e4826\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426105 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c526c9db-5b46-4c19-be12-1b59f7f622dc" (UID: "c526c9db-5b46-4c19-be12-1b59f7f622dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config" (OuterVolumeSpecName: "config") pod "c526c9db-5b46-4c19-be12-1b59f7f622dc" (UID: "c526c9db-5b46-4c19-be12-1b59f7f622dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw2rl\" (UniqueName: \"kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl\") pod \"c526c9db-5b46-4c19-be12-1b59f7f622dc\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca\") pod \"e19a512d-e436-45b4-9701-0121028e4826\" (UID: \"e19a512d-e436-45b4-9701-0121028e4826\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca\") pod \"c526c9db-5b46-4c19-be12-1b59f7f622dc\" (UID: \"c526c9db-5b46-4c19-be12-1b59f7f622dc\") " Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prwnx\" (UniqueName: \"kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.426956 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427130 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427144 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427361 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config" (OuterVolumeSpecName: "config") pod "e19a512d-e436-45b4-9701-0121028e4826" (UID: "e19a512d-e436-45b4-9701-0121028e4826"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427634 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca" (OuterVolumeSpecName: "client-ca") pod "e19a512d-e436-45b4-9701-0121028e4826" (UID: "e19a512d-e436-45b4-9701-0121028e4826"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.427803 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "c526c9db-5b46-4c19-be12-1b59f7f622dc" (UID: "c526c9db-5b46-4c19-be12-1b59f7f622dc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.430742 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569" (OuterVolumeSpecName: "kube-api-access-fq569") pod "e19a512d-e436-45b4-9701-0121028e4826" (UID: "e19a512d-e436-45b4-9701-0121028e4826"). InnerVolumeSpecName "kube-api-access-fq569". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.430725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl" (OuterVolumeSpecName: "kube-api-access-xw2rl") pod "c526c9db-5b46-4c19-be12-1b59f7f622dc" (UID: "c526c9db-5b46-4c19-be12-1b59f7f622dc"). InnerVolumeSpecName "kube-api-access-xw2rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.430827 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c526c9db-5b46-4c19-be12-1b59f7f622dc" (UID: "c526c9db-5b46-4c19-be12-1b59f7f622dc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.432243 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e19a512d-e436-45b4-9701-0121028e4826" (UID: "e19a512d-e436-45b4-9701-0121028e4826"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.528856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prwnx\" (UniqueName: \"kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.529504 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.530485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.531042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.531767 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.531854 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533139 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533322 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e19a512d-e436-45b4-9701-0121028e4826-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533344 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq569\" (UniqueName: \"kubernetes.io/projected/e19a512d-e436-45b4-9701-0121028e4826-kube-api-access-fq569\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533358 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c526c9db-5b46-4c19-be12-1b59f7f622dc-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533370 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533381 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw2rl\" (UniqueName: \"kubernetes.io/projected/c526c9db-5b46-4c19-be12-1b59f7f622dc-kube-api-access-xw2rl\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533391 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e19a512d-e436-45b4-9701-0121028e4826-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533402 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c526c9db-5b46-4c19-be12-1b59f7f622dc-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.533909 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.538771 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.550829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prwnx\" 
(UniqueName: \"kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx\") pod \"controller-manager-9b9b64d5f-gkln6\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.613361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.853584 4885 generic.go:334] "Generic (PLEG): container finished" podID="c526c9db-5b46-4c19-be12-1b59f7f622dc" containerID="b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d" exitCode=0 Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.853639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" event={"ID":"c526c9db-5b46-4c19-be12-1b59f7f622dc","Type":"ContainerDied","Data":"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d"} Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.854046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" event={"ID":"c526c9db-5b46-4c19-be12-1b59f7f622dc","Type":"ContainerDied","Data":"7f45fa8dfef908f246059747d32e0f5cd26ab6e3a4ea4beb783f4e11bbf71bdc"} Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.854078 4885 scope.go:117] "RemoveContainer" containerID="b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.853678 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-f2hfk" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.856987 4885 generic.go:334] "Generic (PLEG): container finished" podID="e19a512d-e436-45b4-9701-0121028e4826" containerID="f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7" exitCode=0 Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.857163 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.857191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" event={"ID":"e19a512d-e436-45b4-9701-0121028e4826","Type":"ContainerDied","Data":"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7"} Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.857310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq" event={"ID":"e19a512d-e436-45b4-9701-0121028e4826","Type":"ContainerDied","Data":"cd51063b9287659f49b46d9387deda2e4a842a38aeeefd41c07b53ac04c09bd2"} Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.874800 4885 scope.go:117] "RemoveContainer" containerID="b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d" Dec 05 20:11:17 crc kubenswrapper[4885]: E1205 20:11:17.875254 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d\": container with ID starting with b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d not found: ID does not exist" containerID="b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.875311 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d"} err="failed to get container status \"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d\": rpc error: code = NotFound desc = could not find container \"b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d\": container with ID starting with b2324af3bb9eb06af777ba4865aea34c3a34bb7a798dd15db46941af0af0a15d not found: ID does not exist" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.875348 4885 scope.go:117] "RemoveContainer" containerID="f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.894850 4885 scope.go:117] "RemoveContainer" containerID="f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7" Dec 05 20:11:17 crc kubenswrapper[4885]: E1205 20:11:17.895451 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7\": container with ID starting with f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7 not found: ID does not exist" containerID="f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.895535 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7"} err="failed to get container status \"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7\": rpc error: code = NotFound desc = could not find container \"f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7\": container with ID starting with f86e1af98fc7e7b3977eb93386d39f35103a4988f47b17edf673b06c4892fca7 not found: ID does not exist" Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.896234 4885 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.901075 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-p69qq"] Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.905168 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.907886 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-f2hfk"] Dec 05 20:11:17 crc kubenswrapper[4885]: I1205 20:11:17.910577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"] Dec 05 20:11:17 crc kubenswrapper[4885]: W1205 20:11:17.911857 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6773a5_7f14_468e_b42c_b0b35a92137c.slice/crio-0029daa0fd7bf150a2c67e2341ff77289737ddbf82c606e60046715690e1a02a WatchSource:0}: Error finding container 0029daa0fd7bf150a2c67e2341ff77289737ddbf82c606e60046715690e1a02a: Status 404 returned error can't find the container with id 0029daa0fd7bf150a2c67e2341ff77289737ddbf82c606e60046715690e1a02a Dec 05 20:11:18 crc kubenswrapper[4885]: I1205 20:11:18.869897 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" event={"ID":"4a6773a5-7f14-468e-b42c-b0b35a92137c","Type":"ContainerStarted","Data":"19555ac5f88cbf6db142aee70b07dd052f11749b19f9fbc2fd6c4d5c87ee0d18"} Dec 05 20:11:18 crc kubenswrapper[4885]: I1205 20:11:18.870253 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" event={"ID":"4a6773a5-7f14-468e-b42c-b0b35a92137c","Type":"ContainerStarted","Data":"0029daa0fd7bf150a2c67e2341ff77289737ddbf82c606e60046715690e1a02a"} Dec 05 20:11:18 crc kubenswrapper[4885]: I1205 20:11:18.870275 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:18 crc kubenswrapper[4885]: I1205 20:11:18.876647 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" Dec 05 20:11:18 crc kubenswrapper[4885]: I1205 20:11:18.891302 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" podStartSLOduration=3.891276911 podStartE2EDuration="3.891276911s" podCreationTimestamp="2025-12-05 20:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:18.889683228 +0000 UTC m=+344.186498899" watchObservedRunningTime="2025-12-05 20:11:18.891276911 +0000 UTC m=+344.188092572" Dec 05 20:11:19 crc kubenswrapper[4885]: I1205 20:11:19.183324 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c526c9db-5b46-4c19-be12-1b59f7f622dc" path="/var/lib/kubelet/pods/c526c9db-5b46-4c19-be12-1b59f7f622dc/volumes" Dec 05 20:11:19 crc kubenswrapper[4885]: I1205 20:11:19.183951 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19a512d-e436-45b4-9701-0121028e4826" 
path="/var/lib/kubelet/pods/e19a512d-e436-45b4-9701-0121028e4826/volumes" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.106397 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.108078 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.110276 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.110504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.110908 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.111051 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.111198 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.111206 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.114452 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.182981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g8sv\" (UniqueName: \"kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.183078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.183159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.183211 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: 
\"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.289451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.289558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.289664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g8sv\" (UniqueName: \"kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.289697 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.290569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.291332 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.295592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.314899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g8sv\" (UniqueName: \"kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv\") pod \"route-controller-manager-866f46fcdc-c5k82\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " 
pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.428356 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.857085 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:20 crc kubenswrapper[4885]: W1205 20:11:20.864767 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d23e814_1c58_45de_87b1_d0fe9f308ef6.slice/crio-772d0b43e89cd3474bc3cd2956a1d2338b0e07e7aa635ea278c4be763152f3ba WatchSource:0}: Error finding container 772d0b43e89cd3474bc3cd2956a1d2338b0e07e7aa635ea278c4be763152f3ba: Status 404 returned error can't find the container with id 772d0b43e89cd3474bc3cd2956a1d2338b0e07e7aa635ea278c4be763152f3ba Dec 05 20:11:20 crc kubenswrapper[4885]: I1205 20:11:20.883868 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" event={"ID":"4d23e814-1c58-45de-87b1-d0fe9f308ef6","Type":"ContainerStarted","Data":"772d0b43e89cd3474bc3cd2956a1d2338b0e07e7aa635ea278c4be763152f3ba"} Dec 05 20:11:21 crc kubenswrapper[4885]: I1205 20:11:21.891040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" event={"ID":"4d23e814-1c58-45de-87b1-d0fe9f308ef6","Type":"ContainerStarted","Data":"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c"} Dec 05 20:11:21 crc kubenswrapper[4885]: I1205 20:11:21.891348 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:21 crc kubenswrapper[4885]: I1205 20:11:21.899808 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:21 crc kubenswrapper[4885]: I1205 20:11:21.930979 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" podStartSLOduration=6.930959212 podStartE2EDuration="6.930959212s" podCreationTimestamp="2025-12-05 20:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:21.914630796 +0000 UTC m=+347.211446467" watchObservedRunningTime="2025-12-05 20:11:21.930959212 +0000 UTC m=+347.227774883" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.169289 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.170154 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" podUID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" containerName="route-controller-manager" containerID="cri-o://2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c" gracePeriod=30 Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.620277 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802104 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config\") pod \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g8sv\" (UniqueName: \"kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv\") pod \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert\") pod \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca\") pod \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\" (UID: \"4d23e814-1c58-45de-87b1-d0fe9f308ef6\") " Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802943 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d23e814-1c58-45de-87b1-d0fe9f308ef6" (UID: "4d23e814-1c58-45de-87b1-d0fe9f308ef6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.802984 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config" (OuterVolumeSpecName: "config") pod "4d23e814-1c58-45de-87b1-d0fe9f308ef6" (UID: "4d23e814-1c58-45de-87b1-d0fe9f308ef6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.808297 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv" (OuterVolumeSpecName: "kube-api-access-7g8sv") pod "4d23e814-1c58-45de-87b1-d0fe9f308ef6" (UID: "4d23e814-1c58-45de-87b1-d0fe9f308ef6"). InnerVolumeSpecName "kube-api-access-7g8sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.808897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d23e814-1c58-45de-87b1-d0fe9f308ef6" (UID: "4d23e814-1c58-45de-87b1-d0fe9f308ef6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.903166 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.903213 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d23e814-1c58-45de-87b1-d0fe9f308ef6-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.903228 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g8sv\" (UniqueName: \"kubernetes.io/projected/4d23e814-1c58-45de-87b1-d0fe9f308ef6-kube-api-access-7g8sv\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.903243 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d23e814-1c58-45de-87b1-d0fe9f308ef6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.962257 4885 generic.go:334] "Generic (PLEG): container finished" podID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" containerID="2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c" exitCode=0 Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.962296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" event={"ID":"4d23e814-1c58-45de-87b1-d0fe9f308ef6","Type":"ContainerDied","Data":"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c"} Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.962328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" event={"ID":"4d23e814-1c58-45de-87b1-d0fe9f308ef6","Type":"ContainerDied","Data":"772d0b43e89cd3474bc3cd2956a1d2338b0e07e7aa635ea278c4be763152f3ba"} Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.962348 4885 scope.go:117] "RemoveContainer" containerID="2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.962369 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.980232 4885 scope.go:117] "RemoveContainer" containerID="2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c" Dec 05 20:11:33 crc kubenswrapper[4885]: E1205 20:11:33.980637 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c\": container with ID starting with 2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c not found: ID does not exist" containerID="2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.980671 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c"} err="failed to get container status \"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c\": rpc error: code = NotFound desc = could not find container \"2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c\": container with ID starting with 2d5d78f209f0c9d31aff2e6917474b1fdb7f98c70d317a2b448d184b3f66a73c not found: ID does not exist" Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.994132 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:33 crc kubenswrapper[4885]: I1205 20:11:33.999705 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-c5k82"] Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.116325 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz"] Dec 05 20:11:35 crc kubenswrapper[4885]: E1205 20:11:35.117825 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" containerName="route-controller-manager" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.117971 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" containerName="route-controller-manager" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.118244 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" containerName="route-controller-manager" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.118900 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.123582 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.123790 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.123944 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.124389 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.124955 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.125254 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.130425 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz"] Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.179428 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d23e814-1c58-45de-87b1-d0fe9f308ef6" path="/var/lib/kubelet/pods/4d23e814-1c58-45de-87b1-d0fe9f308ef6/volumes" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.219255 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-config\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.219300 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eacd5468-cdc8-40dc-9035-2c31f9051130-serving-cert\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.219408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt2t\" (UniqueName: \"kubernetes.io/projected/eacd5468-cdc8-40dc-9035-2c31f9051130-kube-api-access-pxt2t\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.219431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-client-ca\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc 
kubenswrapper[4885]: I1205 20:11:35.320574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eacd5468-cdc8-40dc-9035-2c31f9051130-serving-cert\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.320653 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt2t\" (UniqueName: \"kubernetes.io/projected/eacd5468-cdc8-40dc-9035-2c31f9051130-kube-api-access-pxt2t\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.320679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-client-ca\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.320708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-config\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.321825 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-config\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.322251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eacd5468-cdc8-40dc-9035-2c31f9051130-client-ca\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.326087 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eacd5468-cdc8-40dc-9035-2c31f9051130-serving-cert\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.335474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt2t\" (UniqueName: \"kubernetes.io/projected/eacd5468-cdc8-40dc-9035-2c31f9051130-kube-api-access-pxt2t\") pod \"route-controller-manager-6495bf885b-n74sz\" (UID: \"eacd5468-cdc8-40dc-9035-2c31f9051130\") " pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.446425 4885 util.go:30] "No sandbox for pod can be found. 
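Each reflector.go:368 line above marks a watch cache filling for one ConfigMap or Secret the new pod mounts, so volume setup can read from a synced local cache instead of hitting the API server per request. The same populate-then-wait pattern with client-go shared informers; the kubeconfig path and resync interval here are assumptions for the sketch, not what the kubelet does internally:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// One informer per resource type, scoped to the pod's namespace.
	factory := informers.NewSharedInformerFactoryWithOptions(
		cs, 10*time.Minute,
		informers.WithNamespace("openshift-route-controller-manager"))
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	factory.Start(ctx.Done())
	if cache.WaitForCacheSync(ctx.Done(), cmInformer.HasSynced) {
		fmt.Println("Caches populated for *v1.ConfigMap")
	}
}
```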
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.878222 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz"] Dec 05 20:11:35 crc kubenswrapper[4885]: I1205 20:11:35.985845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" event={"ID":"eacd5468-cdc8-40dc-9035-2c31f9051130","Type":"ContainerStarted","Data":"32bbeada50b1d472ba0985bfd1b4fe9b4f84c223784ef234e0e0f3695837c9b2"} Dec 05 20:11:36 crc kubenswrapper[4885]: I1205 20:11:36.991989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" event={"ID":"eacd5468-cdc8-40dc-9035-2c31f9051130","Type":"ContainerStarted","Data":"a3383752d93b3057a1fe4b8b221c3fae87e9a93d5138b9bf96bc44df2a9b8d63"} Dec 05 20:11:36 crc kubenswrapper[4885]: I1205 20:11:36.992428 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:36 crc kubenswrapper[4885]: I1205 20:11:36.997267 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" Dec 05 20:11:37 crc kubenswrapper[4885]: I1205 20:11:37.011434 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6495bf885b-n74sz" podStartSLOduration=4.011414116 podStartE2EDuration="4.011414116s" podCreationTimestamp="2025-12-05 20:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:37.006839092 +0000 UTC m=+362.303654773" watchObservedRunningTime="2025-12-05 20:11:37.011414116 +0000 UTC m=+362.308229777" Dec 05 20:11:46 crc kubenswrapper[4885]: I1205 20:11:46.631493 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:11:46 crc kubenswrapper[4885]: I1205 20:11:46.632240 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.015590 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gkh4"] Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.016864 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.030542 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gkh4"]
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-tls\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-trusted-ca\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141250 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-bound-sa-token\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141331 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvcjr\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-kube-api-access-fvcjr\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.141353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-certificates\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.165602 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.242830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-tls\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.242906 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-trusted-ca\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.242962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.243001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.243132 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-bound-sa-token\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.243165 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvcjr\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-kube-api-access-fvcjr\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.243202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-certificates\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.243514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.245070 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-trusted-ca\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.245841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-certificates\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.249285 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.249490 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-registry-tls\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.258479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-bound-sa-token\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.260309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvcjr\" (UniqueName: \"kubernetes.io/projected/41470e0d-1c04-4f18-b4de-0d7c9611a8ba-kube-api-access-fvcjr\") pod \"image-registry-66df7c8f76-7gkh4\" (UID: \"41470e0d-1c04-4f18-b4de-0d7c9611a8ba\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.339464 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:51 crc kubenswrapper[4885]: I1205 20:11:51.754699 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gkh4"]
Dec 05 20:11:51 crc kubenswrapper[4885]: W1205 20:11:51.761830 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41470e0d_1c04_4f18_b4de_0d7c9611a8ba.slice/crio-8694229496b50a39caf137662cf909132c8e79eaab2e53e231171eafb6cc2137 WatchSource:0}: Error finding container 8694229496b50a39caf137662cf909132c8e79eaab2e53e231171eafb6cc2137: Status 404 returned error can't find the container with id 8694229496b50a39caf137662cf909132c8e79eaab2e53e231171eafb6cc2137
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.096412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4" event={"ID":"41470e0d-1c04-4f18-b4de-0d7c9611a8ba","Type":"ContainerStarted","Data":"dcb99c85284bf873a1d6da21536fd72c58226b37d3966276441bc0f6e8f1a5a5"}
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.096929 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4" event={"ID":"41470e0d-1c04-4f18-b4de-0d7c9611a8ba","Type":"ContainerStarted","Data":"8694229496b50a39caf137662cf909132c8e79eaab2e53e231171eafb6cc2137"}
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.097053 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4"
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.122989 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4" podStartSLOduration=2.122964202 podStartE2EDuration="2.122964202s" podCreationTimestamp="2025-12-05 20:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:52.118633675 +0000 UTC m=+377.415449376" watchObservedRunningTime="2025-12-05 20:11:52.122964202 +0000 UTC m=+377.419779883"
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.730261 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"]
Dec 05 20:11:52 crc kubenswrapper[4885]: I1205 20:11:52.730480 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" podUID="4a6773a5-7f14-468e-b42c-b0b35a92137c" containerName="controller-manager" containerID="cri-o://19555ac5f88cbf6db142aee70b07dd052f11749b19f9fbc2fd6c4d5c87ee0d18" gracePeriod=30
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.101816 4885 generic.go:334] "Generic (PLEG): container finished" podID="4a6773a5-7f14-468e-b42c-b0b35a92137c" containerID="19555ac5f88cbf6db142aee70b07dd052f11749b19f9fbc2fd6c4d5c87ee0d18" exitCode=0
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.102080 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" event={"ID":"4a6773a5-7f14-468e-b42c-b0b35a92137c","Type":"ContainerDied","Data":"19555ac5f88cbf6db142aee70b07dd052f11749b19f9fbc2fd6c4d5c87ee0d18"}
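The block above is the normal happy path for a pod start: a SyncLoop UPDATE, the volume reconciler attaching and mounting each volume, a fresh sandbox, ContainerStarted events from PLEG, and finally a pod_startup_latency_tracker record with the startup SLO figure. If you want to pull those figures out of a dump like this one, a minimal sketch (not part of the log; assumes klog-style lines on stdin, e.g. piped from journalctl -u kubelet, and a hypothetical script name):

```python
#!/usr/bin/env python3
"""startup_slo.py (hypothetical): print pod startup SLO durations from a
kubelet journal dump. podStartSLOduration is a bare float in seconds and
the pod name is a quoted key=value field on the same line."""
import re
import sys

PATTERN = re.compile(
    r'pod="(?P<pod>[^"]+)".*?podStartSLOduration=(?P<slo>[0-9.]+)'
)

for line in sys.stdin:
    # Only the pod_startup_latency_tracker records carry this message.
    if "Observed pod startup duration" not in line:
        continue
    m = PATTERN.search(line)
    if m:
        print(f"{m.group('pod')}: {float(m.group('slo')):.3f}s")
```

Run as, say, `journalctl -u kubelet | python3 startup_slo.py`; on the excerpt above it would report roughly 2.1 s for image-registry-66df7c8f76-7gkh4.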
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.146765 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.273411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles\") pod \"4a6773a5-7f14-468e-b42c-b0b35a92137c\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") "
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.273494 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert\") pod \"4a6773a5-7f14-468e-b42c-b0b35a92137c\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") "
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.273537 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config\") pod \"4a6773a5-7f14-468e-b42c-b0b35a92137c\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") "
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.273622 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prwnx\" (UniqueName: \"kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx\") pod \"4a6773a5-7f14-468e-b42c-b0b35a92137c\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") "
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.273666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca\") pod \"4a6773a5-7f14-468e-b42c-b0b35a92137c\" (UID: \"4a6773a5-7f14-468e-b42c-b0b35a92137c\") "
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.274596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4a6773a5-7f14-468e-b42c-b0b35a92137c" (UID: "4a6773a5-7f14-468e-b42c-b0b35a92137c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.274653 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config" (OuterVolumeSpecName: "config") pod "4a6773a5-7f14-468e-b42c-b0b35a92137c" (UID: "4a6773a5-7f14-468e-b42c-b0b35a92137c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.274764 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a6773a5-7f14-468e-b42c-b0b35a92137c" (UID: "4a6773a5-7f14-468e-b42c-b0b35a92137c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.281280 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a6773a5-7f14-468e-b42c-b0b35a92137c" (UID: "4a6773a5-7f14-468e-b42c-b0b35a92137c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.281343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx" (OuterVolumeSpecName: "kube-api-access-prwnx") pod "4a6773a5-7f14-468e-b42c-b0b35a92137c" (UID: "4a6773a5-7f14-468e-b42c-b0b35a92137c"). InnerVolumeSpecName "kube-api-access-prwnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.376307 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.376361 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6773a5-7f14-468e-b42c-b0b35a92137c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.376379 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.376398 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prwnx\" (UniqueName: \"kubernetes.io/projected/4a6773a5-7f14-468e-b42c-b0b35a92137c-kube-api-access-prwnx\") on node \"crc\" DevicePath \"\""
Dec 05 20:11:53 crc kubenswrapper[4885]: I1205 20:11:53.376420 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6773a5-7f14-468e-b42c-b0b35a92137c-client-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.110073 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6" event={"ID":"4a6773a5-7f14-468e-b42c-b0b35a92137c","Type":"ContainerDied","Data":"0029daa0fd7bf150a2c67e2341ff77289737ddbf82c606e60046715690e1a02a"}
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.110181 4885 scope.go:117] "RemoveContainer" containerID="19555ac5f88cbf6db142aee70b07dd052f11749b19f9fbc2fd6c4d5c87ee0d18"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.110103 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.133167 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"]
Dec 05 20:11:54 crc kubenswrapper[4885]: E1205 20:11:54.134009 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6773a5-7f14-468e-b42c-b0b35a92137c" containerName="controller-manager"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.134103 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6773a5-7f14-468e-b42c-b0b35a92137c" containerName="controller-manager"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.134408 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6773a5-7f14-468e-b42c-b0b35a92137c" containerName="controller-manager"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.136532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.142722 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.142769 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.143330 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.144213 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.144818 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.145130 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.147491 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"]
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.155933 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.195117 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"]
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.199396 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-gkln6"]
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.288393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-client-ca\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.288465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-config\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.288512 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.288848 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbp67\" (UniqueName: \"kubernetes.io/projected/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-kube-api-access-sbp67\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.288938 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-serving-cert\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.390795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-client-ca\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.390897 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-config\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.390982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.391119 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbp67\" (UniqueName: \"kubernetes.io/projected/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-kube-api-access-sbp67\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.391180 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-serving-cert\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.392853 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-proxy-ca-bundles\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.393336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-config\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.393701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-client-ca\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.398647 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-serving-cert\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.424903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbp67\" (UniqueName: \"kubernetes.io/projected/fe2a56a0-d5a1-4748-9fc2-8111bedf89f7-kube-api-access-sbp67\") pod \"controller-manager-58975fd7b8-hj5x8\" (UID: \"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7\") " pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.467279 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:54 crc kubenswrapper[4885]: I1205 20:11:54.943906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"]
Dec 05 20:11:54 crc kubenswrapper[4885]: W1205 20:11:54.952127 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe2a56a0_d5a1_4748_9fc2_8111bedf89f7.slice/crio-e971b2c46b6806e5b603d6eac2ab0132ebae1c4b2c06bb7e4d5c3024f23a3d54 WatchSource:0}: Error finding container e971b2c46b6806e5b603d6eac2ab0132ebae1c4b2c06bb7e4d5c3024f23a3d54: Status 404 returned error can't find the container with id e971b2c46b6806e5b603d6eac2ab0132ebae1c4b2c06bb7e4d5c3024f23a3d54
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.121332 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8" event={"ID":"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7","Type":"ContainerStarted","Data":"8ba238f116b1716fcc8ab1a326fb587024b9cafbbb6fd083b9fc20ea7b4d7033"}
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.121779 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.121836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8" event={"ID":"fe2a56a0-d5a1-4748-9fc2-8111bedf89f7","Type":"ContainerStarted","Data":"e971b2c46b6806e5b603d6eac2ab0132ebae1c4b2c06bb7e4d5c3024f23a3d54"}
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.122658 4885 patch_prober.go:28] interesting pod/controller-manager-58975fd7b8-hj5x8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body=
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.122719 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8" podUID="fe2a56a0-d5a1-4748-9fc2-8111bedf89f7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused"
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.148113 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8" podStartSLOduration=3.148076671 podStartE2EDuration="3.148076671s" podCreationTimestamp="2025-12-05 20:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:11:55.140307927 +0000 UTC m=+380.437123618" watchObservedRunningTime="2025-12-05 20:11:55.148076671 +0000 UTC m=+380.444892382"
Dec 05 20:11:55 crc kubenswrapper[4885]: I1205 20:11:55.179390 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6773a5-7f14-468e-b42c-b0b35a92137c" path="/var/lib/kubelet/pods/4a6773a5-7f14-468e-b42c-b0b35a92137c/volumes"
Dec 05 20:11:56 crc kubenswrapper[4885]: I1205 20:11:56.131916 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58975fd7b8-hj5x8"
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.835843 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28gdb"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.837806 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-28gdb" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="registry-server" containerID="cri-o://bfc8f52385ce855919cfb4fb6c9b1f526b1f4c727797bec0c95fb7548b8a87ec" gracePeriod=30
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.865826 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24244"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.866314 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-24244" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="registry-server" containerID="cri-o://61990de48711f3bac0a406b6c1f3c6f0caeeb6c924675330235b61763540d3f7" gracePeriod=30
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.872008 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.872286 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" containerID="cri-o://efadd20e9f956c6cca8f25167f14ab63bfcd7f50f8c4d5b6d6a10248e0b3f634" gracePeriod=30
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.881990 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.882363 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g2dmb" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="registry-server" containerID="cri-o://43278e7cf72b3bded009173d27e3c91220d6255b10cd3f3a86db991ad544fe57" gracePeriod=30
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.892110 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djpjw"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.893151 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.908586 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.908947 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9hmw" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="registry-server" containerID="cri-o://df925bd36fa538dc6f19a012f9f9aa71e667ca946f5b1a0343d9d29ed41bad59" gracePeriod=30
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.916434 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djpjw"]
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.932127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.932608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgfx\" (UniqueName: \"kubernetes.io/projected/b7708f77-d399-4d7e-8034-9e043e56aabe-kube-api-access-7tgfx\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:01 crc kubenswrapper[4885]: I1205 20:12:01.932648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.034267 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.034447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgfx\" (UniqueName: \"kubernetes.io/projected/b7708f77-d399-4d7e-8034-9e043e56aabe-kube-api-access-7tgfx\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.034508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.040308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.048012 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b7708f77-d399-4d7e-8034-9e043e56aabe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.055646 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgfx\" (UniqueName: \"kubernetes.io/projected/b7708f77-d399-4d7e-8034-9e043e56aabe-kube-api-access-7tgfx\") pod \"marketplace-operator-79b997595-djpjw\" (UID: \"b7708f77-d399-4d7e-8034-9e043e56aabe\") " pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.164988 4885 generic.go:334] "Generic (PLEG): container finished" podID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerID="43278e7cf72b3bded009173d27e3c91220d6255b10cd3f3a86db991ad544fe57" exitCode=0
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.165065 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerDied","Data":"43278e7cf72b3bded009173d27e3c91220d6255b10cd3f3a86db991ad544fe57"}
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.166480 4885 generic.go:334] "Generic (PLEG): container finished" podID="61f67e39-acf3-4ec4-af3f-68159973345e" containerID="bfc8f52385ce855919cfb4fb6c9b1f526b1f4c727797bec0c95fb7548b8a87ec" exitCode=0
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.166523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerDied","Data":"bfc8f52385ce855919cfb4fb6c9b1f526b1f4c727797bec0c95fb7548b8a87ec"}
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.176402 4885 generic.go:334] "Generic (PLEG): container finished" podID="746679e1-b958-4320-bc6c-00060a83db3f" containerID="61990de48711f3bac0a406b6c1f3c6f0caeeb6c924675330235b61763540d3f7" exitCode=0
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.176481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerDied","Data":"61990de48711f3bac0a406b6c1f3c6f0caeeb6c924675330235b61763540d3f7"}
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.192404 4885 generic.go:334] "Generic (PLEG): container finished" podID="106ffd61-239f-4707-b999-aa044f6f30ae" containerID="efadd20e9f956c6cca8f25167f14ab63bfcd7f50f8c4d5b6d6a10248e0b3f634" exitCode=0
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.192485 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerDied","Data":"efadd20e9f956c6cca8f25167f14ab63bfcd7f50f8c4d5b6d6a10248e0b3f634"}
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.192524 4885 scope.go:117] "RemoveContainer" containerID="8e1392383c19bfc5439cf6a03b16f4e7128a7e48f79ec146434f29359e401e0f"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.196394 4885 generic.go:334] "Generic (PLEG): container finished" podID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerID="df925bd36fa538dc6f19a012f9f9aa71e667ca946f5b1a0343d9d29ed41bad59" exitCode=0
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.196430 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerDied","Data":"df925bd36fa538dc6f19a012f9f9aa71e667ca946f5b1a0343d9d29ed41bad59"}
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.217899 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.361855 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28gdb"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.440433 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content\") pod \"61f67e39-acf3-4ec4-af3f-68159973345e\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.440895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities\") pod \"61f67e39-acf3-4ec4-af3f-68159973345e\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.440922 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qlr4\" (UniqueName: \"kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4\") pod \"61f67e39-acf3-4ec4-af3f-68159973345e\" (UID: \"61f67e39-acf3-4ec4-af3f-68159973345e\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.443149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities" (OuterVolumeSpecName: "utilities") pod "61f67e39-acf3-4ec4-af3f-68159973345e" (UID: "61f67e39-acf3-4ec4-af3f-68159973345e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.454695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4" (OuterVolumeSpecName: "kube-api-access-5qlr4") pod "61f67e39-acf3-4ec4-af3f-68159973345e" (UID: "61f67e39-acf3-4ec4-af3f-68159973345e"). InnerVolumeSpecName "kube-api-access-5qlr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.504781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61f67e39-acf3-4ec4-af3f-68159973345e" (UID: "61f67e39-acf3-4ec4-af3f-68159973345e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.570935 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.570968 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f67e39-acf3-4ec4-af3f-68159973345e-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.570978 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qlr4\" (UniqueName: \"kubernetes.io/projected/61f67e39-acf3-4ec4-af3f-68159973345e-kube-api-access-5qlr4\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.597409 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9hmw"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.604472 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.606298 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2dmb"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.611093 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24244"
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.772943 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca\") pod \"106ffd61-239f-4707-b999-aa044f6f30ae\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773006 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt28l\" (UniqueName: \"kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l\") pod \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773098 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tq47\" (UniqueName: \"kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47\") pod \"746679e1-b958-4320-bc6c-00060a83db3f\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773152 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities\") pod \"746679e1-b958-4320-bc6c-00060a83db3f\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9nk\" (UniqueName: \"kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk\") pod \"106ffd61-239f-4707-b999-aa044f6f30ae\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities\") pod \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content\") pod \"746679e1-b958-4320-bc6c-00060a83db3f\" (UID: \"746679e1-b958-4320-bc6c-00060a83db3f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773401 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics\") pod \"106ffd61-239f-4707-b999-aa044f6f30ae\" (UID: \"106ffd61-239f-4707-b999-aa044f6f30ae\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773452 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-catalog-content\") pod \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities\") pod \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\" (UID: \"60a86faf-f4ad-4e5a-b614-4c90d228b05f\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content\") pod \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773579 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4c8l\" (UniqueName: \"kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l\") pod \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\" (UID: \"61122263-3d9d-4510-87bc-6e8ff3bf7af5\") "
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773654 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "106ffd61-239f-4707-b999-aa044f6f30ae" (UID: "106ffd61-239f-4707-b999-aa044f6f30ae"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.773794 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities" (OuterVolumeSpecName: "utilities") pod "746679e1-b958-4320-bc6c-00060a83db3f" (UID: "746679e1-b958-4320-bc6c-00060a83db3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.774156 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.774205 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.774887 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities" (OuterVolumeSpecName: "utilities") pod "60a86faf-f4ad-4e5a-b614-4c90d228b05f" (UID: "60a86faf-f4ad-4e5a-b614-4c90d228b05f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.775954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l" (OuterVolumeSpecName: "kube-api-access-qt28l") pod "60a86faf-f4ad-4e5a-b614-4c90d228b05f" (UID: "60a86faf-f4ad-4e5a-b614-4c90d228b05f"). InnerVolumeSpecName "kube-api-access-qt28l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.776392 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities" (OuterVolumeSpecName: "utilities") pod "61122263-3d9d-4510-87bc-6e8ff3bf7af5" (UID: "61122263-3d9d-4510-87bc-6e8ff3bf7af5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.776400 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk" (OuterVolumeSpecName: "kube-api-access-fl9nk") pod "106ffd61-239f-4707-b999-aa044f6f30ae" (UID: "106ffd61-239f-4707-b999-aa044f6f30ae"). InnerVolumeSpecName "kube-api-access-fl9nk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.776752 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "106ffd61-239f-4707-b999-aa044f6f30ae" (UID: "106ffd61-239f-4707-b999-aa044f6f30ae"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.778082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l" (OuterVolumeSpecName: "kube-api-access-g4c8l") pod "61122263-3d9d-4510-87bc-6e8ff3bf7af5" (UID: "61122263-3d9d-4510-87bc-6e8ff3bf7af5"). InnerVolumeSpecName "kube-api-access-g4c8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.788357 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47" (OuterVolumeSpecName: "kube-api-access-9tq47") pod "746679e1-b958-4320-bc6c-00060a83db3f" (UID: "746679e1-b958-4320-bc6c-00060a83db3f"). InnerVolumeSpecName "kube-api-access-9tq47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.796874 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61122263-3d9d-4510-87bc-6e8ff3bf7af5" (UID: "61122263-3d9d-4510-87bc-6e8ff3bf7af5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.830921 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "746679e1-b958-4320-bc6c-00060a83db3f" (UID: "746679e1-b958-4320-bc6c-00060a83db3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.838416 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djpjw"]
Dec 05 20:12:02 crc kubenswrapper[4885]: W1205 20:12:02.840913 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7708f77_d399_4d7e_8034_9e043e56aabe.slice/crio-4fe70b8ba77a2a274f021a0a82a6e0e6163b4018652a04057ad6f692827d32a4 WatchSource:0}: Error finding container 4fe70b8ba77a2a274f021a0a82a6e0e6163b4018652a04057ad6f692827d32a4: Status 404 returned error can't find the container with id 4fe70b8ba77a2a274f021a0a82a6e0e6163b4018652a04057ad6f692827d32a4
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875832 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt28l\" (UniqueName: \"kubernetes.io/projected/60a86faf-f4ad-4e5a-b614-4c90d228b05f-kube-api-access-qt28l\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875870 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tq47\" (UniqueName: \"kubernetes.io/projected/746679e1-b958-4320-bc6c-00060a83db3f-kube-api-access-9tq47\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875882 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9nk\" (UniqueName: \"kubernetes.io/projected/106ffd61-239f-4707-b999-aa044f6f30ae-kube-api-access-fl9nk\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875895 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875907 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746679e1-b958-4320-bc6c-00060a83db3f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875919 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/106ffd61-239f-4707-b999-aa044f6f30ae-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875929 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875940 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61122263-3d9d-4510-87bc-6e8ff3bf7af5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.875950 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4c8l\" (UniqueName: \"kubernetes.io/projected/61122263-3d9d-4510-87bc-6e8ff3bf7af5-kube-api-access-g4c8l\") on node \"crc\" DevicePath \"\""
"60a86faf-f4ad-4e5a-b614-4c90d228b05f" (UID: "60a86faf-f4ad-4e5a-b614-4c90d228b05f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:02 crc kubenswrapper[4885]: I1205 20:12:02.976402 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a86faf-f4ad-4e5a-b614-4c90d228b05f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.218928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24244" event={"ID":"746679e1-b958-4320-bc6c-00060a83db3f","Type":"ContainerDied","Data":"f89da8872abaa119890bed0b796f6f54beaa63515ff2798b8bf2cb8b22d8ee9c"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.219124 4885 scope.go:117] "RemoveContainer" containerID="61990de48711f3bac0a406b6c1f3c6f0caeeb6c924675330235b61763540d3f7" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.219014 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24244" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.225995 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" event={"ID":"106ffd61-239f-4707-b999-aa044f6f30ae","Type":"ContainerDied","Data":"ed6590a8bfa567fa8b50c10f01cc6923217488f41884aac615b8fbcd5d38a46f"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.226522 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.235320 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9hmw" event={"ID":"60a86faf-f4ad-4e5a-b614-4c90d228b05f","Type":"ContainerDied","Data":"4f4480e98b75cb5cbdb27f8642ae40d8686d7f6b8b68ec3338fc34cd6da8161c"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.235405 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9hmw" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.238566 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" event={"ID":"b7708f77-d399-4d7e-8034-9e043e56aabe","Type":"ContainerStarted","Data":"dea1ccc3e4ba42204970caf7318022b0acc429d3124c24767fb333bb77fbfad7"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.238603 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" event={"ID":"b7708f77-d399-4d7e-8034-9e043e56aabe","Type":"ContainerStarted","Data":"4fe70b8ba77a2a274f021a0a82a6e0e6163b4018652a04057ad6f692827d32a4"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.238758 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.242035 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-djpjw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.242089 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" podUID="b7708f77-d399-4d7e-8034-9e043e56aabe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.255501 4885 scope.go:117] "RemoveContainer" containerID="0c22501379994176a2e773e608d6442d78718e254ec8d0ce1fb34e2aec652438" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.255635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24244"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.260309 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2dmb" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.260576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2dmb" event={"ID":"61122263-3d9d-4510-87bc-6e8ff3bf7af5","Type":"ContainerDied","Data":"f21e60121141644d8e9589e406844a3ec296032ed41ac770cc5ae1d5280ca5d8"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.266893 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28gdb" event={"ID":"61f67e39-acf3-4ec4-af3f-68159973345e","Type":"ContainerDied","Data":"7d8ca4c12a9bf5340671faec319181cc1e7b0983146e63a473a9acf4d7700985"} Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.266987 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28gdb" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.268321 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-24244"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.272046 4885 scope.go:117] "RemoveContainer" containerID="dc3c2c43f22e9bf622a4c01c331da8efb70e48d3624e9e9f042add3a80fd3256" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.273385 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.293691 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7qfd"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.296260 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" podStartSLOduration=2.29624504 podStartE2EDuration="2.29624504s" podCreationTimestamp="2025-12-05 20:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:03.282144116 +0000 UTC m=+388.578959777" watchObservedRunningTime="2025-12-05 20:12:03.29624504 +0000 UTC m=+388.593060701" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.301208 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.301482 4885 scope.go:117] "RemoveContainer" containerID="efadd20e9f956c6cca8f25167f14ab63bfcd7f50f8c4d5b6d6a10248e0b3f634" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.304307 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9hmw"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.310459 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28gdb"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.313512 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-28gdb"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.317040 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.320442 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2dmb"] Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.325910 4885 scope.go:117] "RemoveContainer" containerID="df925bd36fa538dc6f19a012f9f9aa71e667ca946f5b1a0343d9d29ed41bad59" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.343292 4885 scope.go:117] "RemoveContainer" containerID="200b550d4c45d9e5a5409b99ac1a7953b120128e3baa8c05f6cf354052fabe35" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.356530 4885 scope.go:117] "RemoveContainer" containerID="d71c08fc7b39db1040860f1e8918187d7756db4c522fc57f9ed3d4cecca47a5e" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.379183 4885 scope.go:117] "RemoveContainer" containerID="43278e7cf72b3bded009173d27e3c91220d6255b10cd3f3a86db991ad544fe57" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.391701 4885 scope.go:117] "RemoveContainer" containerID="c331e665a7a095e20adb4e3021499753e7fc7fb06f5a89a70e4fee9c622dcde0" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.405323 
4885 scope.go:117] "RemoveContainer" containerID="e839585263d166d270cc23b043c285f2a3b93e6e49d003877c9048e6d915008f" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.422844 4885 scope.go:117] "RemoveContainer" containerID="bfc8f52385ce855919cfb4fb6c9b1f526b1f4c727797bec0c95fb7548b8a87ec" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.434869 4885 scope.go:117] "RemoveContainer" containerID="d236f500f04d3ebfcaf7b810d1d2049c27a683255308bc63be3dbf348f17e89f" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.446737 4885 scope.go:117] "RemoveContainer" containerID="9b484f4d1df1bfad4ba36b2ca3192b70c35da836f76d7ca70153a71bfb2d75b7" Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.477771 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n7qfd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:12:03 crc kubenswrapper[4885]: I1205 20:12:03.477836 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n7qfd" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053541 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lv47n"] Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053774 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053788 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053800 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053809 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053822 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053829 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053840 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053848 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053857 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053864 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053874 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053881 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053890 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053897 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053908 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053915 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="extract-utilities" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053926 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053933 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053941 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053950 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053964 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053974 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.053980 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.053994 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054001 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="extract-content" Dec 05 20:12:04 crc kubenswrapper[4885]: E1205 20:12:04.054011 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054017 4885 
state_mem.go:107] "Deleted CPUSet assignment" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054129 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054139 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054149 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054161 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" containerName="marketplace-operator" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054175 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="746679e1-b958-4320-bc6c-00060a83db3f" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.054185 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" containerName="registry-server" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.055056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.056646 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.065569 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lv47n"] Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.088096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.088359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddtw\" (UniqueName: \"kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.088501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.189418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 
20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.189837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddtw\" (UniqueName: \"kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.189863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.189924 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.190169 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.211295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddtw\" (UniqueName: \"kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw\") pod \"certified-operators-lv47n\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") " pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.254052 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tnv28"] Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.254980 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.258409 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.266750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnv28"] Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.289309 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-djpjw" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.290950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-catalog-content\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.291046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhgx\" (UniqueName: \"kubernetes.io/projected/fd9a5eba-660b-489b-b9f8-3a5366d313c9-kube-api-access-zlhgx\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.291315 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-utilities\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.374569 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.392055 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-utilities\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.392161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-catalog-content\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.392191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhgx\" (UniqueName: \"kubernetes.io/projected/fd9a5eba-660b-489b-b9f8-3a5366d313c9-kube-api-access-zlhgx\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.392703 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-utilities\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.392816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a5eba-660b-489b-b9f8-3a5366d313c9-catalog-content\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.414012 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhgx\" (UniqueName: \"kubernetes.io/projected/fd9a5eba-660b-489b-b9f8-3a5366d313c9-kube-api-access-zlhgx\") pod \"redhat-marketplace-tnv28\" (UID: \"fd9a5eba-660b-489b-b9f8-3a5366d313c9\") " pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.587387 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.777751 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lv47n"] Dec 05 20:12:04 crc kubenswrapper[4885]: W1205 20:12:04.782529 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012f80db_3d51_4336_94d3_9a54c642d7db.slice/crio-4836d8c7bd384c48e7aa25f93bf537c4be236a53e1d6e3ee4e5a980af77a6060 WatchSource:0}: Error finding container 4836d8c7bd384c48e7aa25f93bf537c4be236a53e1d6e3ee4e5a980af77a6060: Status 404 returned error can't find the container with id 4836d8c7bd384c48e7aa25f93bf537c4be236a53e1d6e3ee4e5a980af77a6060 Dec 05 20:12:04 crc kubenswrapper[4885]: I1205 20:12:04.974579 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnv28"] Dec 05 20:12:04 crc kubenswrapper[4885]: W1205 20:12:04.999129 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd9a5eba_660b_489b_b9f8_3a5366d313c9.slice/crio-91a3300b9ba5aa179a4d707e076b32c42a4f1ad8ea2227ef500b11e4d16bb810 WatchSource:0}: Error finding container 91a3300b9ba5aa179a4d707e076b32c42a4f1ad8ea2227ef500b11e4d16bb810: Status 404 returned error can't find the container with id 91a3300b9ba5aa179a4d707e076b32c42a4f1ad8ea2227ef500b11e4d16bb810 Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.196408 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106ffd61-239f-4707-b999-aa044f6f30ae" path="/var/lib/kubelet/pods/106ffd61-239f-4707-b999-aa044f6f30ae/volumes" Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.200342 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a86faf-f4ad-4e5a-b614-4c90d228b05f" path="/var/lib/kubelet/pods/60a86faf-f4ad-4e5a-b614-4c90d228b05f/volumes" Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.201013 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61122263-3d9d-4510-87bc-6e8ff3bf7af5" path="/var/lib/kubelet/pods/61122263-3d9d-4510-87bc-6e8ff3bf7af5/volumes" Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.201603 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f67e39-acf3-4ec4-af3f-68159973345e" path="/var/lib/kubelet/pods/61f67e39-acf3-4ec4-af3f-68159973345e/volumes" Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.202783 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="746679e1-b958-4320-bc6c-00060a83db3f" path="/var/lib/kubelet/pods/746679e1-b958-4320-bc6c-00060a83db3f/volumes" Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.291447 4885 generic.go:334] "Generic (PLEG): container finished" podID="fd9a5eba-660b-489b-b9f8-3a5366d313c9" containerID="4d3c77194103b8d2b4ece3e0ed5b7eeff810e9fbaacc9e7bff5c8f453d128ce5" exitCode=0 Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.291485 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnv28" event={"ID":"fd9a5eba-660b-489b-b9f8-3a5366d313c9","Type":"ContainerDied","Data":"4d3c77194103b8d2b4ece3e0ed5b7eeff810e9fbaacc9e7bff5c8f453d128ce5"} Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.291520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnv28" 
event={"ID":"fd9a5eba-660b-489b-b9f8-3a5366d313c9","Type":"ContainerStarted","Data":"91a3300b9ba5aa179a4d707e076b32c42a4f1ad8ea2227ef500b11e4d16bb810"} Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.292707 4885 generic.go:334] "Generic (PLEG): container finished" podID="012f80db-3d51-4336-94d3-9a54c642d7db" containerID="68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e" exitCode=0 Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.293104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerDied","Data":"68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e"} Dec 05 20:12:05 crc kubenswrapper[4885]: I1205 20:12:05.293191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerStarted","Data":"4836d8c7bd384c48e7aa25f93bf537c4be236a53e1d6e3ee4e5a980af77a6060"} Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.301639 4885 generic.go:334] "Generic (PLEG): container finished" podID="012f80db-3d51-4336-94d3-9a54c642d7db" containerID="109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616" exitCode=0 Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.301747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerDied","Data":"109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616"} Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.305621 4885 generic.go:334] "Generic (PLEG): container finished" podID="fd9a5eba-660b-489b-b9f8-3a5366d313c9" containerID="36d9172abf7089cd455a1275c9183799cadc09ccb4ccc367adb560c438e1ab50" exitCode=0 Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.305661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnv28" event={"ID":"fd9a5eba-660b-489b-b9f8-3a5366d313c9","Type":"ContainerDied","Data":"36d9172abf7089cd455a1275c9183799cadc09ccb4ccc367adb560c438e1ab50"} Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.464148 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fwl8v"] Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.465492 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.467780 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.469498 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwl8v"] Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.625992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-catalog-content\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.626047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-utilities\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.626102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22tsh\" (UniqueName: \"kubernetes.io/projected/f2250db3-b5b2-435f-bd9e-1b599f663d70-kube-api-access-22tsh\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.681657 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2wmcd"] Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.682763 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.686486 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.686948 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wmcd"] Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727257 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22tsh\" (UniqueName: \"kubernetes.io/projected/f2250db3-b5b2-435f-bd9e-1b599f663d70-kube-api-access-22tsh\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-catalog-content\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-catalog-content\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727460 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-utilities\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-utilities\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6lb6\" (UniqueName: \"kubernetes.io/projected/d1cc6544-7046-414f-9f36-71801abdfe03-kube-api-access-b6lb6\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727847 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-utilities\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.727909 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2250db3-b5b2-435f-bd9e-1b599f663d70-catalog-content\") pod \"community-operators-fwl8v\" (UID: 
\"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.748940 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22tsh\" (UniqueName: \"kubernetes.io/projected/f2250db3-b5b2-435f-bd9e-1b599f663d70-kube-api-access-22tsh\") pod \"community-operators-fwl8v\" (UID: \"f2250db3-b5b2-435f-bd9e-1b599f663d70\") " pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.786894 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.831212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-utilities\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.831286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6lb6\" (UniqueName: \"kubernetes.io/projected/d1cc6544-7046-414f-9f36-71801abdfe03-kube-api-access-b6lb6\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.831340 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-catalog-content\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.832340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-catalog-content\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.832789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cc6544-7046-414f-9f36-71801abdfe03-utilities\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:06 crc kubenswrapper[4885]: I1205 20:12:06.862251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6lb6\" (UniqueName: \"kubernetes.io/projected/d1cc6544-7046-414f-9f36-71801abdfe03-kube-api-access-b6lb6\") pod \"redhat-operators-2wmcd\" (UID: \"d1cc6544-7046-414f-9f36-71801abdfe03\") " pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.010826 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.232278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwl8v"] Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.312128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerStarted","Data":"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"} Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.315977 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnv28" event={"ID":"fd9a5eba-660b-489b-b9f8-3a5366d313c9","Type":"ContainerStarted","Data":"399a05a7424e65d0b81a121892ceeb0664536705d10707568ade260bd27d8f61"} Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.321409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwl8v" event={"ID":"f2250db3-b5b2-435f-bd9e-1b599f663d70","Type":"ContainerStarted","Data":"adb782ff7680436c59a1afcc10c86de1f99179159bc278c0a3234087bd20e326"} Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.329075 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lv47n" podStartSLOduration=1.9047283990000001 podStartE2EDuration="3.329059014s" podCreationTimestamp="2025-12-05 20:12:04 +0000 UTC" firstStartedPulling="2025-12-05 20:12:05.294127412 +0000 UTC m=+390.590943073" lastFinishedPulling="2025-12-05 20:12:06.718458027 +0000 UTC m=+392.015273688" observedRunningTime="2025-12-05 20:12:07.327668003 +0000 UTC m=+392.624483664" watchObservedRunningTime="2025-12-05 20:12:07.329059014 +0000 UTC m=+392.625874675" Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.344990 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tnv28" podStartSLOduration=1.863129566 podStartE2EDuration="3.34497171s" podCreationTimestamp="2025-12-05 20:12:04 +0000 UTC" firstStartedPulling="2025-12-05 20:12:05.29374046 +0000 UTC m=+390.590556121" lastFinishedPulling="2025-12-05 20:12:06.775582604 +0000 UTC m=+392.072398265" observedRunningTime="2025-12-05 20:12:07.343577819 +0000 UTC m=+392.640393500" watchObservedRunningTime="2025-12-05 20:12:07.34497171 +0000 UTC m=+392.641787371" Dec 05 20:12:07 crc kubenswrapper[4885]: I1205 20:12:07.398894 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wmcd"] Dec 05 20:12:07 crc kubenswrapper[4885]: W1205 20:12:07.443547 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1cc6544_7046_414f_9f36_71801abdfe03.slice/crio-4b82fd52a5576320e32a137f4cc81cfd74ecc6933b65be6c8352e544a6a329e8 WatchSource:0}: Error finding container 4b82fd52a5576320e32a137f4cc81cfd74ecc6933b65be6c8352e544a6a329e8: Status 404 returned error can't find the container with id 4b82fd52a5576320e32a137f4cc81cfd74ecc6933b65be6c8352e544a6a329e8 Dec 05 20:12:08 crc kubenswrapper[4885]: I1205 20:12:08.330202 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1cc6544-7046-414f-9f36-71801abdfe03" containerID="6ff5d3e0873cb79ed234b46ccc8deafa6332dcf5ce1197423757faf42367403b" exitCode=0 Dec 05 20:12:08 crc kubenswrapper[4885]: I1205 20:12:08.330303 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-2wmcd" event={"ID":"d1cc6544-7046-414f-9f36-71801abdfe03","Type":"ContainerDied","Data":"6ff5d3e0873cb79ed234b46ccc8deafa6332dcf5ce1197423757faf42367403b"} Dec 05 20:12:08 crc kubenswrapper[4885]: I1205 20:12:08.330641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wmcd" event={"ID":"d1cc6544-7046-414f-9f36-71801abdfe03","Type":"ContainerStarted","Data":"4b82fd52a5576320e32a137f4cc81cfd74ecc6933b65be6c8352e544a6a329e8"} Dec 05 20:12:08 crc kubenswrapper[4885]: I1205 20:12:08.333987 4885 generic.go:334] "Generic (PLEG): container finished" podID="f2250db3-b5b2-435f-bd9e-1b599f663d70" containerID="cb06864392e247c9dab90df6d695d77200b01b982258dca042ccee7af444e38c" exitCode=0 Dec 05 20:12:08 crc kubenswrapper[4885]: I1205 20:12:08.334192 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwl8v" event={"ID":"f2250db3-b5b2-435f-bd9e-1b599f663d70","Type":"ContainerDied","Data":"cb06864392e247c9dab90df6d695d77200b01b982258dca042ccee7af444e38c"} Dec 05 20:12:09 crc kubenswrapper[4885]: I1205 20:12:09.342441 4885 generic.go:334] "Generic (PLEG): container finished" podID="f2250db3-b5b2-435f-bd9e-1b599f663d70" containerID="846fda7447984b1656ef6075e73333cb3ff4d77426279821140018a2d39e1ff1" exitCode=0 Dec 05 20:12:09 crc kubenswrapper[4885]: I1205 20:12:09.342514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwl8v" event={"ID":"f2250db3-b5b2-435f-bd9e-1b599f663d70","Type":"ContainerDied","Data":"846fda7447984b1656ef6075e73333cb3ff4d77426279821140018a2d39e1ff1"} Dec 05 20:12:09 crc kubenswrapper[4885]: I1205 20:12:09.347334 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wmcd" event={"ID":"d1cc6544-7046-414f-9f36-71801abdfe03","Type":"ContainerStarted","Data":"77b47f2892b08882a08e31a356a2dc556d3a9250bf4a9daa4be720466864f34f"} Dec 05 20:12:10 crc kubenswrapper[4885]: I1205 20:12:10.355653 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwl8v" event={"ID":"f2250db3-b5b2-435f-bd9e-1b599f663d70","Type":"ContainerStarted","Data":"79c41c4246d0ed2876622e3d46c90b5b438998230eeaaf78a4ac97528755d5a2"} Dec 05 20:12:10 crc kubenswrapper[4885]: I1205 20:12:10.358110 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1cc6544-7046-414f-9f36-71801abdfe03" containerID="77b47f2892b08882a08e31a356a2dc556d3a9250bf4a9daa4be720466864f34f" exitCode=0 Dec 05 20:12:10 crc kubenswrapper[4885]: I1205 20:12:10.358429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wmcd" event={"ID":"d1cc6544-7046-414f-9f36-71801abdfe03","Type":"ContainerDied","Data":"77b47f2892b08882a08e31a356a2dc556d3a9250bf4a9daa4be720466864f34f"} Dec 05 20:12:10 crc kubenswrapper[4885]: I1205 20:12:10.375707 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fwl8v" podStartSLOduration=2.994437872 podStartE2EDuration="4.37569012s" podCreationTimestamp="2025-12-05 20:12:06 +0000 UTC" firstStartedPulling="2025-12-05 20:12:08.336111969 +0000 UTC m=+393.632927640" lastFinishedPulling="2025-12-05 20:12:09.717364227 +0000 UTC m=+395.014179888" observedRunningTime="2025-12-05 20:12:10.373575727 +0000 UTC m=+395.670391388" watchObservedRunningTime="2025-12-05 20:12:10.37569012 +0000 UTC m=+395.672505781" Dec 05 20:12:11 
crc kubenswrapper[4885]: I1205 20:12:11.344377 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7gkh4" Dec 05 20:12:11 crc kubenswrapper[4885]: I1205 20:12:11.390642 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"] Dec 05 20:12:12 crc kubenswrapper[4885]: I1205 20:12:12.371823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wmcd" event={"ID":"d1cc6544-7046-414f-9f36-71801abdfe03","Type":"ContainerStarted","Data":"de2d38596e99553bbcea4e1ed3bda633aaffd9aa11ec304424eb44371321c97f"} Dec 05 20:12:12 crc kubenswrapper[4885]: I1205 20:12:12.388724 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2wmcd" podStartSLOduration=3.8441321999999998 podStartE2EDuration="6.388702557s" podCreationTimestamp="2025-12-05 20:12:06 +0000 UTC" firstStartedPulling="2025-12-05 20:12:08.332737999 +0000 UTC m=+393.629553680" lastFinishedPulling="2025-12-05 20:12:10.877308376 +0000 UTC m=+396.174124037" observedRunningTime="2025-12-05 20:12:12.385638955 +0000 UTC m=+397.682454616" watchObservedRunningTime="2025-12-05 20:12:12.388702557 +0000 UTC m=+397.685518238" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.374970 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.377541 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.434608 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.587938 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.587996 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:14 crc kubenswrapper[4885]: I1205 20:12:14.637569 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:15 crc kubenswrapper[4885]: I1205 20:12:15.442367 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lv47n" Dec 05 20:12:15 crc kubenswrapper[4885]: I1205 20:12:15.452178 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tnv28" Dec 05 20:12:16 crc kubenswrapper[4885]: I1205 20:12:16.631616 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:12:16 crc kubenswrapper[4885]: I1205 20:12:16.631691 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 05 20:12:16 crc kubenswrapper[4885]: I1205 20:12:16.787534 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:16 crc kubenswrapper[4885]: I1205 20:12:16.787691 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:16 crc kubenswrapper[4885]: I1205 20:12:16.827440 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:17 crc kubenswrapper[4885]: I1205 20:12:17.011811 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:17 crc kubenswrapper[4885]: I1205 20:12:17.011891 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:17 crc kubenswrapper[4885]: I1205 20:12:17.439172 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fwl8v" Dec 05 20:12:18 crc kubenswrapper[4885]: I1205 20:12:18.052202 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2wmcd" podUID="d1cc6544-7046-414f-9f36-71801abdfe03" containerName="registry-server" probeResult="failure" output=< Dec 05 20:12:18 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 20:12:18 crc kubenswrapper[4885]: > Dec 05 20:12:27 crc kubenswrapper[4885]: I1205 20:12:27.051002 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:27 crc kubenswrapper[4885]: I1205 20:12:27.100208 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2wmcd" Dec 05 20:12:36 crc kubenswrapper[4885]: I1205 20:12:36.451249 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" podUID="c98724fc-908e-4a61-bb2b-905c0f5709a5" containerName="registry" containerID="cri-o://3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba" gracePeriod=30 Dec 05 20:12:36 crc kubenswrapper[4885]: I1205 20:12:36.907553 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.049662 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.049908 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.049934 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.049992 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.050011 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.050068 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.050103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.050125 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqgqx\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx\") pod \"c98724fc-908e-4a61-bb2b-905c0f5709a5\" (UID: \"c98724fc-908e-4a61-bb2b-905c0f5709a5\") " Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.053039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.053146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.055366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx" (OuterVolumeSpecName: "kube-api-access-qqgqx") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "kube-api-access-qqgqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.055904 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.061336 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.061605 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.067162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.074347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c98724fc-908e-4a61-bb2b-905c0f5709a5" (UID: "c98724fc-908e-4a61-bb2b-905c0f5709a5"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.151975 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c98724fc-908e-4a61-bb2b-905c0f5709a5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152070 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqgqx\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-kube-api-access-qqgqx\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152091 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152117 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c98724fc-908e-4a61-bb2b-905c0f5709a5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152128 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c98724fc-908e-4a61-bb2b-905c0f5709a5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152138 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.152148 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c98724fc-908e-4a61-bb2b-905c0f5709a5-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.528571 4885 generic.go:334] "Generic (PLEG): container finished" podID="c98724fc-908e-4a61-bb2b-905c0f5709a5" containerID="3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba" exitCode=0 Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.528661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" event={"ID":"c98724fc-908e-4a61-bb2b-905c0f5709a5","Type":"ContainerDied","Data":"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba"} Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.528712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" event={"ID":"c98724fc-908e-4a61-bb2b-905c0f5709a5","Type":"ContainerDied","Data":"7231f8b1bfa265d89f54bb8cda89353e2d9580675a3afcb7ad0d7bee453a1663"} Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.528765 4885 scope.go:117] "RemoveContainer" containerID="3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.529098 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bctth" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.560818 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"] Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.560932 4885 scope.go:117] "RemoveContainer" containerID="3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba" Dec 05 20:12:37 crc kubenswrapper[4885]: E1205 20:12:37.561675 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba\": container with ID starting with 3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba not found: ID does not exist" containerID="3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.561739 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba"} err="failed to get container status \"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba\": rpc error: code = NotFound desc = could not find container \"3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba\": container with ID starting with 3e57ef77fa8f3ece6249172da68606887bea5cd1584954260ea420ff9591b6ba not found: ID does not exist" Dec 05 20:12:37 crc kubenswrapper[4885]: I1205 20:12:37.566173 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bctth"] Dec 05 20:12:39 crc kubenswrapper[4885]: I1205 20:12:39.180585 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98724fc-908e-4a61-bb2b-905c0f5709a5" path="/var/lib/kubelet/pods/c98724fc-908e-4a61-bb2b-905c0f5709a5/volumes" Dec 05 20:12:46 crc kubenswrapper[4885]: I1205 20:12:46.630996 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:12:46 crc kubenswrapper[4885]: I1205 20:12:46.631294 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:12:46 crc kubenswrapper[4885]: I1205 20:12:46.631335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:12:46 crc kubenswrapper[4885]: I1205 20:12:46.631838 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:12:46 crc kubenswrapper[4885]: I1205 20:12:46.631892 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" 
podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10" gracePeriod=600 Dec 05 20:12:47 crc kubenswrapper[4885]: I1205 20:12:47.595194 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10" exitCode=0 Dec 05 20:12:47 crc kubenswrapper[4885]: I1205 20:12:47.595259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10"} Dec 05 20:12:47 crc kubenswrapper[4885]: I1205 20:12:47.595824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9"} Dec 05 20:12:47 crc kubenswrapper[4885]: I1205 20:12:47.595856 4885 scope.go:117] "RemoveContainer" containerID="ccb103c8a4ec4c0c64800974238f58b2fbf6a39374fd4dad375bba2f486eadda" Dec 05 20:14:46 crc kubenswrapper[4885]: I1205 20:14:46.631391 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:14:46 crc kubenswrapper[4885]: I1205 20:14:46.632355 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.181684 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f"] Dec 05 20:15:00 crc kubenswrapper[4885]: E1205 20:15:00.182546 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98724fc-908e-4a61-bb2b-905c0f5709a5" containerName="registry" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.182562 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98724fc-908e-4a61-bb2b-905c0f5709a5" containerName="registry" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.182676 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98724fc-908e-4a61-bb2b-905c0f5709a5" containerName="registry" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.183077 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.185185 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.185766 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.193336 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f"] Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.278152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mgb4\" (UniqueName: \"kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.278204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.278226 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.379292 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mgb4\" (UniqueName: \"kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.379365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.379399 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.380776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume\") pod 
\"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.389175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.411142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mgb4\" (UniqueName: \"kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4\") pod \"collect-profiles-29416095-pbz7f\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.500728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:00 crc kubenswrapper[4885]: I1205 20:15:00.714546 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f"] Dec 05 20:15:01 crc kubenswrapper[4885]: I1205 20:15:01.424516 4885 generic.go:334] "Generic (PLEG): container finished" podID="1caf49a1-5103-4ab8-b2a8-8d395fe66c43" containerID="cfee1ab840ed566ecba7ee5d477a3e165b3ab9c508be3f2a408dbc71dcacc39c" exitCode=0 Dec 05 20:15:01 crc kubenswrapper[4885]: I1205 20:15:01.424604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" event={"ID":"1caf49a1-5103-4ab8-b2a8-8d395fe66c43","Type":"ContainerDied","Data":"cfee1ab840ed566ecba7ee5d477a3e165b3ab9c508be3f2a408dbc71dcacc39c"} Dec 05 20:15:01 crc kubenswrapper[4885]: I1205 20:15:01.427721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" event={"ID":"1caf49a1-5103-4ab8-b2a8-8d395fe66c43","Type":"ContainerStarted","Data":"1085f4ab6056128ede39c629e1aeaa059985ee5e535fabb788f9a1d2cbed51cd"} Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.654427 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.713728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume\") pod \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.713814 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mgb4\" (UniqueName: \"kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4\") pod \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.713876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume\") pod \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\" (UID: \"1caf49a1-5103-4ab8-b2a8-8d395fe66c43\") " Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.714811 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume" (OuterVolumeSpecName: "config-volume") pod "1caf49a1-5103-4ab8-b2a8-8d395fe66c43" (UID: "1caf49a1-5103-4ab8-b2a8-8d395fe66c43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.718623 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1caf49a1-5103-4ab8-b2a8-8d395fe66c43" (UID: "1caf49a1-5103-4ab8-b2a8-8d395fe66c43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.718776 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4" (OuterVolumeSpecName: "kube-api-access-5mgb4") pod "1caf49a1-5103-4ab8-b2a8-8d395fe66c43" (UID: "1caf49a1-5103-4ab8-b2a8-8d395fe66c43"). InnerVolumeSpecName "kube-api-access-5mgb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.815133 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.815169 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:02 crc kubenswrapper[4885]: I1205 20:15:02.815183 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mgb4\" (UniqueName: \"kubernetes.io/projected/1caf49a1-5103-4ab8-b2a8-8d395fe66c43-kube-api-access-5mgb4\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:03 crc kubenswrapper[4885]: I1205 20:15:03.444903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" event={"ID":"1caf49a1-5103-4ab8-b2a8-8d395fe66c43","Type":"ContainerDied","Data":"1085f4ab6056128ede39c629e1aeaa059985ee5e535fabb788f9a1d2cbed51cd"} Dec 05 20:15:03 crc kubenswrapper[4885]: I1205 20:15:03.444966 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1085f4ab6056128ede39c629e1aeaa059985ee5e535fabb788f9a1d2cbed51cd" Dec 05 20:15:03 crc kubenswrapper[4885]: I1205 20:15:03.444970 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f" Dec 05 20:15:16 crc kubenswrapper[4885]: I1205 20:15:16.630814 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:15:16 crc kubenswrapper[4885]: I1205 20:15:16.631550 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:15:46 crc kubenswrapper[4885]: I1205 20:15:46.630995 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:15:46 crc kubenswrapper[4885]: I1205 20:15:46.631713 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:15:46 crc kubenswrapper[4885]: I1205 20:15:46.631775 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:15:46 crc kubenswrapper[4885]: I1205 20:15:46.632559 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:15:46 crc kubenswrapper[4885]: I1205 20:15:46.632650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9" gracePeriod=600 Dec 05 20:15:47 crc kubenswrapper[4885]: I1205 20:15:47.744676 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9" exitCode=0 Dec 05 20:15:47 crc kubenswrapper[4885]: I1205 20:15:47.744870 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9"} Dec 05 20:15:47 crc kubenswrapper[4885]: I1205 20:15:47.745511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d"} Dec 05 20:15:47 crc kubenswrapper[4885]: I1205 20:15:47.745552 4885 scope.go:117] "RemoveContainer" containerID="1a568b5e804c681b6f6e3432a30cb5455a65a10c85104a11e96d2fe71376be10" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.184070 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6th47"] Dec 05 20:17:12 crc kubenswrapper[4885]: E1205 20:17:12.184933 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1caf49a1-5103-4ab8-b2a8-8d395fe66c43" containerName="collect-profiles" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.184952 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1caf49a1-5103-4ab8-b2a8-8d395fe66c43" containerName="collect-profiles" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.185121 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1caf49a1-5103-4ab8-b2a8-8d395fe66c43" containerName="collect-profiles" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.185595 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.188410 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ghpnn" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.188769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.189000 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.195947 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z8hk7"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.197159 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-z8hk7" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.199433 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-kxq8h" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.211205 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6th47"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.226703 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kn7\" (UniqueName: \"kubernetes.io/projected/c7c60e10-72a8-4031-8e22-2f7b2ccc720c-kube-api-access-r9kn7\") pod \"cert-manager-5b446d88c5-z8hk7\" (UID: \"c7c60e10-72a8-4031-8e22-2f7b2ccc720c\") " pod="cert-manager/cert-manager-5b446d88c5-z8hk7" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.226764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkp8\" (UniqueName: \"kubernetes.io/projected/c3ccb845-eaa6-44fd-b7ea-4f3739516528-kube-api-access-6xkp8\") pod \"cert-manager-cainjector-7f985d654d-6th47\" (UID: \"c3ccb845-eaa6-44fd-b7ea-4f3739516528\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.228759 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z8hk7"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.243664 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4swqf"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.244261 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4swqf"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.244360 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.246471 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rwrfx" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.327819 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkp8\" (UniqueName: \"kubernetes.io/projected/c3ccb845-eaa6-44fd-b7ea-4f3739516528-kube-api-access-6xkp8\") pod \"cert-manager-cainjector-7f985d654d-6th47\" (UID: \"c3ccb845-eaa6-44fd-b7ea-4f3739516528\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.327880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbgs\" (UniqueName: \"kubernetes.io/projected/21e6c715-7d1f-405a-9d66-8ac102a2e623-kube-api-access-2wbgs\") pod \"cert-manager-webhook-5655c58dd6-4swqf\" (UID: \"21e6c715-7d1f-405a-9d66-8ac102a2e623\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.327959 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kn7\" (UniqueName: \"kubernetes.io/projected/c7c60e10-72a8-4031-8e22-2f7b2ccc720c-kube-api-access-r9kn7\") pod \"cert-manager-5b446d88c5-z8hk7\" (UID: \"c7c60e10-72a8-4031-8e22-2f7b2ccc720c\") " pod="cert-manager/cert-manager-5b446d88c5-z8hk7" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.351553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkp8\" (UniqueName: \"kubernetes.io/projected/c3ccb845-eaa6-44fd-b7ea-4f3739516528-kube-api-access-6xkp8\") pod \"cert-manager-cainjector-7f985d654d-6th47\" (UID: \"c3ccb845-eaa6-44fd-b7ea-4f3739516528\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.351830 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kn7\" (UniqueName: \"kubernetes.io/projected/c7c60e10-72a8-4031-8e22-2f7b2ccc720c-kube-api-access-r9kn7\") pod \"cert-manager-5b446d88c5-z8hk7\" (UID: \"c7c60e10-72a8-4031-8e22-2f7b2ccc720c\") " pod="cert-manager/cert-manager-5b446d88c5-z8hk7" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.429550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbgs\" (UniqueName: \"kubernetes.io/projected/21e6c715-7d1f-405a-9d66-8ac102a2e623-kube-api-access-2wbgs\") pod \"cert-manager-webhook-5655c58dd6-4swqf\" (UID: \"21e6c715-7d1f-405a-9d66-8ac102a2e623\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.445311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wbgs\" (UniqueName: \"kubernetes.io/projected/21e6c715-7d1f-405a-9d66-8ac102a2e623-kube-api-access-2wbgs\") pod \"cert-manager-webhook-5655c58dd6-4swqf\" (UID: \"21e6c715-7d1f-405a-9d66-8ac102a2e623\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.511357 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.523163 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-z8hk7" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.557119 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.819906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4swqf"] Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.827269 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.981796 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6th47"] Dec 05 20:17:12 crc kubenswrapper[4885]: W1205 20:17:12.983852 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ccb845_eaa6_44fd_b7ea_4f3739516528.slice/crio-80a8fe2e73a8f040eda71c18d5ecdeaee694cce09e76b353d157518c1fdc2356 WatchSource:0}: Error finding container 80a8fe2e73a8f040eda71c18d5ecdeaee694cce09e76b353d157518c1fdc2356: Status 404 returned error can't find the container with id 80a8fe2e73a8f040eda71c18d5ecdeaee694cce09e76b353d157518c1fdc2356 Dec 05 20:17:12 crc kubenswrapper[4885]: I1205 20:17:12.986378 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z8hk7"] Dec 05 20:17:12 crc kubenswrapper[4885]: W1205 20:17:12.991276 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7c60e10_72a8_4031_8e22_2f7b2ccc720c.slice/crio-6b3dae10cbc64da81a193fb8b5a8b7131300d4b0bb0cf5f3d15d0b383d92a80c WatchSource:0}: Error finding container 6b3dae10cbc64da81a193fb8b5a8b7131300d4b0bb0cf5f3d15d0b383d92a80c: Status 404 returned error can't find the container with id 6b3dae10cbc64da81a193fb8b5a8b7131300d4b0bb0cf5f3d15d0b383d92a80c Dec 05 20:17:13 crc kubenswrapper[4885]: I1205 20:17:13.283165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" event={"ID":"c3ccb845-eaa6-44fd-b7ea-4f3739516528","Type":"ContainerStarted","Data":"80a8fe2e73a8f040eda71c18d5ecdeaee694cce09e76b353d157518c1fdc2356"} Dec 05 20:17:13 crc kubenswrapper[4885]: I1205 20:17:13.284462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-z8hk7" event={"ID":"c7c60e10-72a8-4031-8e22-2f7b2ccc720c","Type":"ContainerStarted","Data":"6b3dae10cbc64da81a193fb8b5a8b7131300d4b0bb0cf5f3d15d0b383d92a80c"} Dec 05 20:17:13 crc kubenswrapper[4885]: I1205 20:17:13.286042 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" event={"ID":"21e6c715-7d1f-405a-9d66-8ac102a2e623","Type":"ContainerStarted","Data":"8e18dfbdd0fe9ffd42102a70167a1aa30281689225b0601142c4d564218aaaec"} Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.304010 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" event={"ID":"c3ccb845-eaa6-44fd-b7ea-4f3739516528","Type":"ContainerStarted","Data":"c6024fdad37abcf4e684ea60896e6d54fa94af3b3d24e1057ddfaff8aa9d50ae"} Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.306381 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-z8hk7" 
event={"ID":"c7c60e10-72a8-4031-8e22-2f7b2ccc720c","Type":"ContainerStarted","Data":"f25d97d2a47ae6fdfffa34a2c517150fac3caf5af8879049d3fdaf757a3241c2"} Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.307692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" event={"ID":"21e6c715-7d1f-405a-9d66-8ac102a2e623","Type":"ContainerStarted","Data":"acbc5c524fce82c477576d742714ec5883ae33c3e6ae428ee5920c9b67ff58fa"} Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.308092 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.319571 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-6th47" podStartSLOduration=1.402287535 podStartE2EDuration="4.319552658s" podCreationTimestamp="2025-12-05 20:17:12 +0000 UTC" firstStartedPulling="2025-12-05 20:17:12.985808953 +0000 UTC m=+698.282624614" lastFinishedPulling="2025-12-05 20:17:15.903074066 +0000 UTC m=+701.199889737" observedRunningTime="2025-12-05 20:17:16.3167323 +0000 UTC m=+701.613547971" watchObservedRunningTime="2025-12-05 20:17:16.319552658 +0000 UTC m=+701.616368319" Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.338000 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-z8hk7" podStartSLOduration=1.420468069 podStartE2EDuration="4.337982399s" podCreationTimestamp="2025-12-05 20:17:12 +0000 UTC" firstStartedPulling="2025-12-05 20:17:12.993528922 +0000 UTC m=+698.290344603" lastFinishedPulling="2025-12-05 20:17:15.911043262 +0000 UTC m=+701.207858933" observedRunningTime="2025-12-05 20:17:16.334195091 +0000 UTC m=+701.631010762" watchObservedRunningTime="2025-12-05 20:17:16.337982399 +0000 UTC m=+701.634798060" Dec 05 20:17:16 crc kubenswrapper[4885]: I1205 20:17:16.349656 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" podStartSLOduration=1.219978473 podStartE2EDuration="4.34963985s" podCreationTimestamp="2025-12-05 20:17:12 +0000 UTC" firstStartedPulling="2025-12-05 20:17:12.827045579 +0000 UTC m=+698.123861240" lastFinishedPulling="2025-12-05 20:17:15.956706956 +0000 UTC m=+701.253522617" observedRunningTime="2025-12-05 20:17:16.347469942 +0000 UTC m=+701.644285603" watchObservedRunningTime="2025-12-05 20:17:16.34963985 +0000 UTC m=+701.646455511" Dec 05 20:17:22 crc kubenswrapper[4885]: I1205 20:17:22.561885 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-4swqf" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.379132 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wx7m6"] Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.379875 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-controller" containerID="cri-o://f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.379914 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="nbdb" 
containerID="cri-o://5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.379974 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="northd" containerID="cri-o://22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.379941 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.380005 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-node" containerID="cri-o://af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.380131 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="sbdb" containerID="cri-o://284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.380137 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-acl-logging" containerID="cri-o://56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.424220 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" containerID="cri-o://4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" gracePeriod=30 Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.662149 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/3.log" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.664176 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovn-acl-logging/0.log" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.664682 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovn-controller/0.log" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.665119 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.709819 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wwvx4"] Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710244 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.710326 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710386 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-node" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.710456 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-node" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710519 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="nbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.710585 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="nbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710648 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-acl-logging" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.710716 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-acl-logging" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710785 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.710854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.710946 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="northd" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711007 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="northd" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.711113 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="sbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711185 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="sbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.711249 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711304 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.711366 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711431 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.711504 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kubecfg-setup" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711566 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kubecfg-setup" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.711628 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711854 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711932 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.711995 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="sbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712113 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712180 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712303 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="nbdb" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712361 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712415 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="kube-rbac-proxy-node" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712468 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712516 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovn-acl-logging" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712561 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="northd" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 
20:17:24.712686 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712740 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: E1205 20:17:24.712794 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.712845 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" containerName="ovnkube-controller" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.714475 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.788948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789049 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789100 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log" (OuterVolumeSpecName: "node-log") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789234 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789250 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789449 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789494 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dcsp\" (UniqueName: \"kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789520 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789554 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789638 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789748 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib\") pod 
\"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.789896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash\") pod \"86ae690a-3705-45ae-8816-da5f33d2105e\" (UID: \"86ae690a-3705-45ae-8816-da5f33d2105e\") " Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-ovn\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-bin\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790151 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790180 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-slash\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket" (OuterVolumeSpecName: "log-socket") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790228 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-script-lib\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790242 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash" (OuterVolumeSpecName: "host-slash") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790268 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-systemd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-var-lib-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-node-log\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-env-overrides\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790440 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-netd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-log-socket\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790618 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovn-node-metrics-cert\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-kubelet\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-systemd-units\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-netns\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-config\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-etc-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pdvb\" (UniqueName: \"kubernetes.io/projected/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-kube-api-access-6pdvb\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790879 4885 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790893 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790905 4885 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790916 4885 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790927 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790938 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790949 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790974 4885 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790992 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86ae690a-3705-45ae-8816-da5f33d2105e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791002 4885 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791010 4885 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791033 4885 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791046 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791056 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791066 4885 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.791075 4885 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.790059 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.794791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp" (OuterVolumeSpecName: "kube-api-access-8dcsp") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "kube-api-access-8dcsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.795268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.803227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "86ae690a-3705-45ae-8816-da5f33d2105e" (UID: "86ae690a-3705-45ae-8816-da5f33d2105e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-env-overrides\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891793 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-netd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-log-socket\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891909 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovn-node-metrics-cert\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891926 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-netd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891941 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891979 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-log-socket\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891964 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-kubelet\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891930 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-kubelet\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.891987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-systemd-units\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892155 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-netns\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892172 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-config\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-etc-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pdvb\" (UniqueName: \"kubernetes.io/projected/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-kube-api-access-6pdvb\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc 
kubenswrapper[4885]: I1205 20:17:24.892200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-ovn\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-etc-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-systemd-units\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-run-netns\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892246 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-ovn\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-bin\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892347 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-env-overrides\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-cni-bin\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-slash\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-host-slash\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-script-lib\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-systemd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-var-lib-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-run-systemd\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892554 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-node-log\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892564 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-var-lib-openvswitch\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-node-log\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892655 4885 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892684 
4885 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86ae690a-3705-45ae-8816-da5f33d2105e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892693 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86ae690a-3705-45ae-8816-da5f33d2105e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892703 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dcsp\" (UniqueName: \"kubernetes.io/projected/86ae690a-3705-45ae-8816-da5f33d2105e-kube-api-access-8dcsp\") on node \"crc\" DevicePath \"\"" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.892987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-script-lib\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.893354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovnkube-config\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.895416 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-ovn-node-metrics-cert\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:24 crc kubenswrapper[4885]: I1205 20:17:24.909681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pdvb\" (UniqueName: \"kubernetes.io/projected/0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a-kube-api-access-6pdvb\") pod \"ovnkube-node-wwvx4\" (UID: \"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.032543 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" Dec 05 20:17:25 crc kubenswrapper[4885]: W1205 20:17:25.054088 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ceb25a8_cb04_457f_b6d0_6b3f86d7ac7a.slice/crio-8493ae434809c00235df1b997d8ba65ce94c8cc969b87d0511ac40b1c6655a18 WatchSource:0}: Error finding container 8493ae434809c00235df1b997d8ba65ce94c8cc969b87d0511ac40b1c6655a18: Status 404 returned error can't find the container with id 8493ae434809c00235df1b997d8ba65ce94c8cc969b87d0511ac40b1c6655a18 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.369873 4885 generic.go:334] "Generic (PLEG): container finished" podID="0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a" containerID="b3e2ea3e5bb275fdda79414d204359f5e64f646213170abef1eb08a38cb0144e" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.369953 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerDied","Data":"b3e2ea3e5bb275fdda79414d204359f5e64f646213170abef1eb08a38cb0144e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.370043 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"8493ae434809c00235df1b997d8ba65ce94c8cc969b87d0511ac40b1c6655a18"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.382080 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/2.log" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.382745 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/1.log" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.382811 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6c25e90-efcc-490c-afef-970c3a62c809" containerID="d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad" exitCode=2 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.382902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerDied","Data":"d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.382955 4885 scope.go:117] "RemoveContainer" containerID="23633e674cb5832d0d0815f1e0ef1b70ffa2e6c2d92c3fc60d46c9ff7d4cc9ab" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.383834 4885 scope.go:117] "RemoveContainer" containerID="d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.384290 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zmtwj_openshift-multus(c6c25e90-efcc-490c-afef-970c3a62c809)\"" pod="openshift-multus/multus-zmtwj" podUID="c6c25e90-efcc-490c-afef-970c3a62c809" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.387484 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovnkube-controller/3.log" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.398596 
4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovn-acl-logging/0.log" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.399624 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wx7m6_86ae690a-3705-45ae-8816-da5f33d2105e/ovn-controller/0.log" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400224 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400261 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400275 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400289 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400306 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400319 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4" exitCode=0 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400332 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89" exitCode=143 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400347 4885 generic.go:334] "Generic (PLEG): container finished" podID="86ae690a-3705-45ae-8816-da5f33d2105e" containerID="f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e" exitCode=143 Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400438 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400475 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400510 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400527 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400539 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400549 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400560 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400571 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400581 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400592 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400603 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400613 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400629 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" 
event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400649 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400665 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400678 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400694 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400705 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400721 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400731 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400742 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400752 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400762 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400777 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400792 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400804 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400815 4885 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400825 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400836 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400846 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400857 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400867 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400877 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400889 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" event={"ID":"86ae690a-3705-45ae-8816-da5f33d2105e","Type":"ContainerDied","Data":"f0a478b9735a3f724cc2dc5edceee3817447922bda2d93fc34a194602825bfee"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400918 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400955 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400967 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400977 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400988 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.400999 4885 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.401010 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.401043 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.401054 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.401066 4885 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.401218 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wx7m6" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.418902 4885 scope.go:117] "RemoveContainer" containerID="4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.437440 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.479146 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wx7m6"] Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.481989 4885 scope.go:117] "RemoveContainer" containerID="284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.490766 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wx7m6"] Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.524862 4885 scope.go:117] "RemoveContainer" containerID="5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.542920 4885 scope.go:117] "RemoveContainer" containerID="22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.555621 4885 scope.go:117] "RemoveContainer" containerID="8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.569174 4885 scope.go:117] "RemoveContainer" containerID="af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.592004 4885 scope.go:117] "RemoveContainer" containerID="56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.622872 4885 scope.go:117] "RemoveContainer" containerID="f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.638318 4885 scope.go:117] "RemoveContainer" containerID="9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.663758 4885 scope.go:117] 
"RemoveContainer" containerID="4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.664448 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420\": container with ID starting with 4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420 not found: ID does not exist" containerID="4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.664492 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} err="failed to get container status \"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420\": rpc error: code = NotFound desc = could not find container \"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420\": container with ID starting with 4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.664577 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.665176 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\": container with ID starting with ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d not found: ID does not exist" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.665198 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} err="failed to get container status \"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\": rpc error: code = NotFound desc = could not find container \"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\": container with ID starting with ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.665213 4885 scope.go:117] "RemoveContainer" containerID="284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.665840 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\": container with ID starting with 284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5 not found: ID does not exist" containerID="284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.665894 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5"} err="failed to get container status \"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\": rpc error: code = NotFound desc = could not find container \"284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5\": container with ID starting with 
284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.665931 4885 scope.go:117] "RemoveContainer" containerID="5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.666291 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\": container with ID starting with 5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662 not found: ID does not exist" containerID="5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.666313 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662"} err="failed to get container status \"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\": rpc error: code = NotFound desc = could not find container \"5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662\": container with ID starting with 5c2e298a9764b1109458870263fa8dd0955737a57a636021c63cc2267f3f5662 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.666327 4885 scope.go:117] "RemoveContainer" containerID="22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.666626 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\": container with ID starting with 22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821 not found: ID does not exist" containerID="22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.666651 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821"} err="failed to get container status \"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\": rpc error: code = NotFound desc = could not find container \"22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821\": container with ID starting with 22448d6945ec815156d8511acde2720ca819f2bec45738400813707da7229821 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.666661 4885 scope.go:117] "RemoveContainer" containerID="8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.667135 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\": container with ID starting with 8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad not found: ID does not exist" containerID="8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667170 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad"} err="failed to get container status \"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\": rpc 
error: code = NotFound desc = could not find container \"8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad\": container with ID starting with 8d9dd9b7539cc4681a06704d317e921ab6658030c3484fb761626203167177ad not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667190 4885 scope.go:117] "RemoveContainer" containerID="af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.667460 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\": container with ID starting with af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4 not found: ID does not exist" containerID="af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667487 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4"} err="failed to get container status \"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\": rpc error: code = NotFound desc = could not find container \"af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4\": container with ID starting with af01bcfc3884ce0826fbca36aaf16900b5a18d41faafce69a9833010d227ebf4 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667507 4885 scope.go:117] "RemoveContainer" containerID="56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.667827 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\": container with ID starting with 56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89 not found: ID does not exist" containerID="56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667851 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89"} err="failed to get container status \"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\": rpc error: code = NotFound desc = could not find container \"56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89\": container with ID starting with 56fcc34781bfdfc3ec8bdb9da36027b851e5ae8147b49c06812afaec7a110d89 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.667869 4885 scope.go:117] "RemoveContainer" containerID="f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.668282 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\": container with ID starting with f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e not found: ID does not exist" containerID="f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668305 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e"} err="failed to get container status \"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\": rpc error: code = NotFound desc = could not find container \"f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e\": container with ID starting with f13da89a869afd37e51b292f757b1db679cf2b661811a408e82e54faa398f83e not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668322 4885 scope.go:117] "RemoveContainer" containerID="9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186" Dec 05 20:17:25 crc kubenswrapper[4885]: E1205 20:17:25.668614 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\": container with ID starting with 9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186 not found: ID does not exist" containerID="9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668636 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186"} err="failed to get container status \"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\": rpc error: code = NotFound desc = could not find container \"9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186\": container with ID starting with 9ae1a9b170fb3d4b09ec48a5338f11f0ba3e5d51f3556ae53e2e010098a31186 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668656 4885 scope.go:117] "RemoveContainer" containerID="4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668879 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420"} err="failed to get container status \"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420\": rpc error: code = NotFound desc = could not find container \"4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420\": container with ID starting with 4af3f4a218fc6906aeaf2a2b45bb5ad1f4318d80e90d025d76e80518319fa420 not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.668900 4885 scope.go:117] "RemoveContainer" containerID="ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.669464 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d"} err="failed to get container status \"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\": rpc error: code = NotFound desc = could not find container \"ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d\": container with ID starting with ed4dda0686339664acf2de50aee80fe788adb7c1f67350b7c8f7787b8de44a7d not found: ID does not exist" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.669482 4885 scope.go:117] "RemoveContainer" containerID="284cc4216a56c0f138c21c211596234e96b96587765ae802bf193a80b786efa5" Dec 05 20:17:25 crc kubenswrapper[4885]: I1205 20:17:25.669744 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
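The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" records above are the kubelet re-checking containers that CRI-O has already removed: the ovnkube-node-wx7m6 pod was deleted, the runtime finished first, and every follow-up status call comes back as gRPC NotFound. The kubelet treats that answer as benign and repeats the same ten-container sweep several times within a few milliseconds before moving on. Below is a minimal Go sketch of that tolerate-NotFound delete pattern; the runtimeClient interface is a stand-in for illustration, not the real CRI RuntimeService.

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// runtimeClient stands in for a CRI runtime connection (illustrative only).
type runtimeClient interface {
	RemoveContainer(ctx context.Context, id string) error
}

// removeIfPresent treats gRPC NotFound as success: "already removed" is
// exactly the end state the delete was trying to reach.
func removeIfPresent(ctx context.Context, rc runtimeClient, id string) error {
	err := rc.RemoveContainer(ctx, id)
	if status.Code(err) == codes.NotFound {
		fmt.Printf("container %q already gone; nothing to do\n", id)
		return nil
	}
	return err
}

// fakeRuntime answers NotFound, as CRI-O does for every ID in the sweep above.
type fakeRuntime struct{}

func (fakeRuntime) RemoveContainer(_ context.Context, id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	fmt.Println(removeIfPresent(context.Background(), fakeRuntime{}, "example-id")) // prints <nil>
}

Treating NotFound as success is what makes the sweep above harmless cleanup noise rather than a failure condition.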
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.408782 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/2.log"
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414255 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"3e49a94063989ddb8607be3fe7ccf520f4299c6deb5bc2bdc5d0629087578f95"}
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"b3c55e7950c27fcd63e104ecf6e1a0fb3ddd3aed30dfd9a02952da5398ec5df1"}
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"86a176c452de130e45b7855dbcae30216bf1f3a5394ef626fdeaade2b5b67dfb"}
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"472856e868a1dcd590237c4f771b9415ffdbf736420fb468edaaa8db4aaeff68"}
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"4850472a6dcc3559b6d63fd89ed94afbc2b9d32a82a718adc1f07efc984eef1a"}
Dec 05 20:17:26 crc kubenswrapper[4885]: I1205 20:17:26.414545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"94875b18cf19b67ff43101c439d56fbbd6229fea5299643a733e78be189e6f1d"}
Dec 05 20:17:27 crc kubenswrapper[4885]: I1205 20:17:27.184685 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ae690a-3705-45ae-8816-da5f33d2105e" path="/var/lib/kubelet/pods/86ae690a-3705-45ae-8816-da5f33d2105e/volumes"
Dec 05 20:17:28 crc kubenswrapper[4885]: I1205 20:17:28.431786 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"9d8a260619efc9e3c37cf5910a90ac59de36ba946e3a0000604ce3589fd0ae15"}
Dec 05 20:17:31 crc kubenswrapper[4885]: I1205 20:17:31.453795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" event={"ID":"0ceb25a8-cb04-457f-b6d0-6b3f86d7ac7a","Type":"ContainerStarted","Data":"903c7d95ebe389794f36321d27d0ddc4dd19b768c436241c068605bc03ac2ec1"}
Dec 05 20:17:31 crc kubenswrapper[4885]: I1205 20:17:31.454124 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:17:31 crc kubenswrapper[4885]: I1205 20:17:31.490647 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4" podStartSLOduration=7.490622634 podStartE2EDuration="7.490622634s" podCreationTimestamp="2025-12-05 20:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:17:31.48986092 +0000 UTC m=+716.786676581" watchObservedRunningTime="2025-12-05 20:17:31.490622634 +0000 UTC m=+716.787438305"
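The "Observed pod startup duration" record just above is the kubelet's startup-latency SLO tracker: podStartE2EDuration is the wall-clock time from podCreationTimestamp (20:17:24) to the pod being observed running (20:17:31), and the m=+716... suffixes are Go monotonic-clock readings. A small sketch of extracting that field from such a record for offline analysis; the regexp and helper are our own, not kubelet code.

package main

import (
	"fmt"
	"regexp"
	"time"
)

// e2eRe pulls the quoted duration out of a pod_startup_latency_tracker record.
var e2eRe = regexp.MustCompile(`podStartE2EDuration="([^"]+)"`)

func main() {
	line := `"Observed pod startup duration" podStartSLOduration=7.490622634 podStartE2EDuration="7.490622634s"`
	m := e2eRe.FindStringSubmatch(line)
	if m == nil {
		return
	}
	if d, err := time.ParseDuration(m[1]); err == nil {
		fmt.Printf("pod became ready %.1fs after creation\n", d.Seconds()) // 7.5s
	}
}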
Dec 05 20:17:31 crc kubenswrapper[4885]: I1205 20:17:31.496004 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:17:32 crc kubenswrapper[4885]: I1205 20:17:32.459966 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:17:32 crc kubenswrapper[4885]: I1205 20:17:32.460389 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:17:32 crc kubenswrapper[4885]: I1205 20:17:32.527466 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:17:37 crc kubenswrapper[4885]: I1205 20:17:37.173197 4885 scope.go:117] "RemoveContainer" containerID="d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad"
Dec 05 20:17:37 crc kubenswrapper[4885]: E1205 20:17:37.174458 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zmtwj_openshift-multus(c6c25e90-efcc-490c-afef-970c3a62c809)\"" pod="openshift-multus/multus-zmtwj" podUID="c6c25e90-efcc-490c-afef-970c3a62c809"
Dec 05 20:17:46 crc kubenswrapper[4885]: I1205 20:17:46.631406 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:17:46 crc kubenswrapper[4885]: I1205 20:17:46.634212 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:17:49 crc kubenswrapper[4885]: I1205 20:17:49.172849 4885 scope.go:117] "RemoveContainer" containerID="d0608305a462e681e80ef2ee794a2cc5f59edbf5e205a15f06bd9821cf14f5ad"
Dec 05 20:17:49 crc kubenswrapper[4885]: I1205 20:17:49.574306 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmtwj_c6c25e90-efcc-490c-afef-970c3a62c809/kube-multus/2.log"
Dec 05 20:17:49 crc kubenswrapper[4885]: I1205 20:17:49.574630 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmtwj" event={"ID":"c6c25e90-efcc-490c-afef-970c3a62c809","Type":"ContainerStarted","Data":"ed322da84953899c4cf4b8867efffd721f05aff0fd2c32baff7f596814250c9d"}
Dec 05 20:17:55 crc kubenswrapper[4885]: I1205 20:17:55.059953 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwvx4"
Dec 05 20:18:02 crc kubenswrapper[4885]: I1205 20:18:02.872253 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"]
Dec 05 20:18:02 crc kubenswrapper[4885]: I1205 20:18:02.874403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
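At 20:17:37 above, kube-multus is refused a restart with "CrashLoopBackOff: back-off 20s", the second step of the kubelet's restart back-off, which per the Kubernetes documentation starts at 10s and doubles up to a 5m cap, resetting after a container runs cleanly for 10 minutes. Consistent with that, the container is recreated at 20:17:49, once the 20s window has elapsed. A sketch of the documented series follows; exact behavior can vary by kubelet version.

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second // documented initial back-off
	const maxDelay = 5 * time.Minute
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("restart %d delayed by %s\n", attempt, delay) // 10s, 20s, 40s, ...
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}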
Dec 05 20:18:02 crc kubenswrapper[4885]: I1205 20:18:02.876524 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 20:18:02 crc kubenswrapper[4885]: I1205 20:18:02.882386 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"]
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.004755 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.004844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.004880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftwd8\" (UniqueName: \"kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.106389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.106486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftwd8\" (UniqueName: \"kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.106583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.107322 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
\"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.107353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.140506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftwd8\" (UniqueName: \"kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.198564 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.436598 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"] Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.658935 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" event={"ID":"b4c07e66-01e1-4851-92f0-2e498a2f04bf","Type":"ContainerStarted","Data":"e06a7d793c1700d59f33f4d072fc0178ab5f2ba6fc4de7aed0f4ac3e47a9afbc"} Dec 05 20:18:03 crc kubenswrapper[4885]: I1205 20:18:03.658993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" event={"ID":"b4c07e66-01e1-4851-92f0-2e498a2f04bf","Type":"ContainerStarted","Data":"12065e39b32fb3ac6ac136e16e4853ff3e78e60b0f8f6f2decb68a745f95fb96"} Dec 05 20:18:04 crc kubenswrapper[4885]: E1205 20:18:04.499423 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c07e66_01e1_4851_92f0_2e498a2f04bf.slice/crio-conmon-e06a7d793c1700d59f33f4d072fc0178ab5f2ba6fc4de7aed0f4ac3e47a9afbc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c07e66_01e1_4851_92f0_2e498a2f04bf.slice/crio-e06a7d793c1700d59f33f4d072fc0178ab5f2ba6fc4de7aed0f4ac3e47a9afbc.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:18:04 crc kubenswrapper[4885]: I1205 20:18:04.664639 4885 generic.go:334] "Generic (PLEG): container finished" podID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerID="e06a7d793c1700d59f33f4d072fc0178ab5f2ba6fc4de7aed0f4ac3e47a9afbc" exitCode=0 Dec 05 20:18:04 crc kubenswrapper[4885]: I1205 20:18:04.664694 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" 
Dec 05 20:18:06 crc kubenswrapper[4885]: I1205 20:18:06.678679 4885 generic.go:334] "Generic (PLEG): container finished" podID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerID="5b0a4d933034940519dcc99540a1203d1bd735ed139bfaec8686965925bccc6c" exitCode=0
Dec 05 20:18:06 crc kubenswrapper[4885]: I1205 20:18:06.678790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" event={"ID":"b4c07e66-01e1-4851-92f0-2e498a2f04bf","Type":"ContainerDied","Data":"5b0a4d933034940519dcc99540a1203d1bd735ed139bfaec8686965925bccc6c"}
Dec 05 20:18:07 crc kubenswrapper[4885]: I1205 20:18:07.687989 4885 generic.go:334] "Generic (PLEG): container finished" podID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerID="5b093ea2a2815cc8c112abd0a3442883f07cdb5e68fb0b9db02b153eb0bcd9a0" exitCode=0
Dec 05 20:18:07 crc kubenswrapper[4885]: I1205 20:18:07.688121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" event={"ID":"b4c07e66-01e1-4851-92f0-2e498a2f04bf","Type":"ContainerDied","Data":"5b093ea2a2815cc8c112abd0a3442883f07cdb5e68fb0b9db02b153eb0bcd9a0"}
Dec 05 20:18:08 crc kubenswrapper[4885]: I1205 20:18:08.953363 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.084501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle\") pod \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") "
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.084911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") pod \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") "
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.084991 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftwd8\" (UniqueName: \"kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8\") pod \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\" (UID: \"b4c07e66-01e1-4851-92f0-2e498a2f04bf\") "
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.085482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle" (OuterVolumeSpecName: "bundle") pod "b4c07e66-01e1-4851-92f0-2e498a2f04bf" (UID: "b4c07e66-01e1-4851-92f0-2e498a2f04bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.091746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8" (OuterVolumeSpecName: "kube-api-access-ftwd8") pod "b4c07e66-01e1-4851-92f0-2e498a2f04bf" (UID: "b4c07e66-01e1-4851-92f0-2e498a2f04bf"). InnerVolumeSpecName "kube-api-access-ftwd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.186480 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.186511 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftwd8\" (UniqueName: \"kubernetes.io/projected/b4c07e66-01e1-4851-92f0-2e498a2f04bf-kube-api-access-ftwd8\") on node \"crc\" DevicePath \"\""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.315494 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util" (OuterVolumeSpecName: "util") pod "b4c07e66-01e1-4851-92f0-2e498a2f04bf" (UID: "b4c07e66-01e1-4851-92f0-2e498a2f04bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.389042 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4c07e66-01e1-4851-92f0-2e498a2f04bf-util\") on node \"crc\" DevicePath \"\""
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.702641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8" event={"ID":"b4c07e66-01e1-4851-92f0-2e498a2f04bf","Type":"ContainerDied","Data":"12065e39b32fb3ac6ac136e16e4853ff3e78e60b0f8f6f2decb68a745f95fb96"}
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.702699 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8"
Dec 05 20:18:09 crc kubenswrapper[4885]: I1205 20:18:09.702701 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12065e39b32fb3ac6ac136e16e4853ff3e78e60b0f8f6f2decb68a745f95fb96"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.441061 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"]
Dec 05 20:18:14 crc kubenswrapper[4885]: E1205 20:18:14.441515 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="util"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.441529 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="util"
Dec 05 20:18:14 crc kubenswrapper[4885]: E1205 20:18:14.441545 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="pull"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.441552 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="pull"
Dec 05 20:18:14 crc kubenswrapper[4885]: E1205 20:18:14.441562 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="extract"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.441568 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="extract"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.441661 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c07e66-01e1-4851-92f0-2e498a2f04bf" containerName="extract"
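The cpu_manager and memory_manager "RemoveStaleState" records fire as the nmstate-operator pod is admitted: the resource managers keep per-container assignments keyed by pod UID, the entries for the finished bundle pod's util, pull, and extract containers have outlived it, and admission-time reconciliation prunes them. A map-based sketch of the idea, with illustrative types rather than the real cpumanager state:

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod is no longer active.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	state := map[key]string{
		{"b4c07e66-01e1-4851-92f0-2e498a2f04bf", "util"}:    "cpus 0-3",
		{"b4c07e66-01e1-4851-92f0-2e498a2f04bf", "pull"}:    "cpus 0-3",
		{"b4c07e66-01e1-4851-92f0-2e498a2f04bf", "extract"}: "cpus 0-3",
	}
	removeStaleState(state, map[string]bool{}) // the bundle pod is gone
}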
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.442040 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.443957 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.444326 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7prvp"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.444347 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.458244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"]
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.557414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtdzn\" (UniqueName: \"kubernetes.io/projected/5275a59b-4935-4ce8-8552-ed28f0377be5-kube-api-access-dtdzn\") pod \"nmstate-operator-5b5b58f5c8-p8qxg\" (UID: \"5275a59b-4935-4ce8-8552-ed28f0377be5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.658703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtdzn\" (UniqueName: \"kubernetes.io/projected/5275a59b-4935-4ce8-8552-ed28f0377be5-kube-api-access-dtdzn\") pod \"nmstate-operator-5b5b58f5c8-p8qxg\" (UID: \"5275a59b-4935-4ce8-8552-ed28f0377be5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.676942 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtdzn\" (UniqueName: \"kubernetes.io/projected/5275a59b-4935-4ce8-8552-ed28f0377be5-kube-api-access-dtdzn\") pod \"nmstate-operator-5b5b58f5c8-p8qxg\" (UID: \"5275a59b-4935-4ce8-8552-ed28f0377be5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"
Dec 05 20:18:14 crc kubenswrapper[4885]: I1205 20:18:14.760578 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"
Dec 05 20:18:15 crc kubenswrapper[4885]: I1205 20:18:15.007841 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg"]
Dec 05 20:18:15 crc kubenswrapper[4885]: I1205 20:18:15.732702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg" event={"ID":"5275a59b-4935-4ce8-8552-ed28f0377be5","Type":"ContainerStarted","Data":"c7af4f6db876369a0f8acbc61de593478e4c2d5381951fe187edf7e58d316764"}
Dec 05 20:18:15 crc kubenswrapper[4885]: I1205 20:18:15.926564 4885 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:18:16 crc kubenswrapper[4885]: I1205 20:18:16.631072 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:18:16 crc kubenswrapper[4885]: I1205 20:18:16.631137 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:18:17 crc kubenswrapper[4885]: I1205 20:18:17.761327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg" event={"ID":"5275a59b-4935-4ce8-8552-ed28f0377be5","Type":"ContainerStarted","Data":"a539cbf0a949eb039b5c686fb9305338046ee091998530f0ca23b9b047da8c08"}
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.226528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8qxg" podStartSLOduration=7.261175289 podStartE2EDuration="9.226510832s" podCreationTimestamp="2025-12-05 20:18:14 +0000 UTC" firstStartedPulling="2025-12-05 20:18:15.01343996 +0000 UTC m=+760.310255621" lastFinishedPulling="2025-12-05 20:18:16.978775503 +0000 UTC m=+762.275591164" observedRunningTime="2025-12-05 20:18:17.787737581 +0000 UTC m=+763.084553252" watchObservedRunningTime="2025-12-05 20:18:23.226510832 +0000 UTC m=+768.523326493"
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.230706 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld"]
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.231541 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld"
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.235393 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-894mj"
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.241225 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld"]
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.244124 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7"]
Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.244775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7"
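The recurring machine-config-daemon failures above (20:17:46 and 20:18:16) are ordinary HTTP liveness probes: the kubelet issues a GET against the container's health endpoint, and a connection error or a status outside 200-399 counts as a failure; only a run of failures past the probe's failureThreshold triggers a restart. A sketch of the probe semantics using the endpoint from the log (the prober itself lives in the kubelet, not here):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP liveness check against a health endpoint.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as logged above
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}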
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.246873 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.255802 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ndhsp"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.256656 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.275783 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzxq\" (UniqueName: \"kubernetes.io/projected/7001b6ac-1126-4d81-9148-47e6f7f830c1-kube-api-access-ltzxq\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9p8l\" (UniqueName: \"kubernetes.io/projected/7e345d16-e7f9-4881-a031-eb5ef37e22b3-kube-api-access-k9p8l\") pod \"nmstate-metrics-7f946cbc9-lhpld\" (UID: \"7e345d16-e7f9-4881-a031-eb5ef37e22b3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278191 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-ovs-socket\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-nmstate-lock\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476vk\" (UniqueName: \"kubernetes.io/projected/912fc0d4-121a-4073-9e85-a2277a5078d8-kube-api-access-476vk\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.278334 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-dbus-socket\") 
pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.365006 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.366158 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.369806 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.370846 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.370866 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.370871 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-c6hjj" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.378950 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzxq\" (UniqueName: \"kubernetes.io/projected/7001b6ac-1126-4d81-9148-47e6f7f830c1-kube-api-access-ltzxq\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.378988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9p8l\" (UniqueName: \"kubernetes.io/projected/7e345d16-e7f9-4881-a031-eb5ef37e22b3-kube-api-access-k9p8l\") pod \"nmstate-metrics-7f946cbc9-lhpld\" (UID: \"7e345d16-e7f9-4881-a031-eb5ef37e22b3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-ovs-socket\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-nmstate-lock\") pod \"nmstate-handler-ndhsp\" (UID: 
\"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476vk\" (UniqueName: \"kubernetes.io/projected/912fc0d4-121a-4073-9e85-a2277a5078d8-kube-api-access-476vk\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-dbus-socket\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72454e30-d40f-408d-93f6-c0cf1ce2f400-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfhr\" (UniqueName: \"kubernetes.io/projected/72454e30-d40f-408d-93f6-c0cf1ce2f400-kube-api-access-gwfhr\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: E1205 20:18:23.379313 4885 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-ovs-socket\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: E1205 20:18:23.379361 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair podName:7001b6ac-1126-4d81-9148-47e6f7f830c1 nodeName:}" failed. No retries permitted until 2025-12-05 20:18:23.879345321 +0000 UTC m=+769.176160982 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-ph4g7" (UID: "7001b6ac-1126-4d81-9148-47e6f7f830c1") : secret "openshift-nmstate-webhook" not found Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-nmstate-lock\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.379723 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/912fc0d4-121a-4073-9e85-a2277a5078d8-dbus-socket\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.400242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9p8l\" (UniqueName: \"kubernetes.io/projected/7e345d16-e7f9-4881-a031-eb5ef37e22b3-kube-api-access-k9p8l\") pod \"nmstate-metrics-7f946cbc9-lhpld\" (UID: \"7e345d16-e7f9-4881-a031-eb5ef37e22b3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.400837 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzxq\" (UniqueName: \"kubernetes.io/projected/7001b6ac-1126-4d81-9148-47e6f7f830c1-kube-api-access-ltzxq\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.403161 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476vk\" (UniqueName: \"kubernetes.io/projected/912fc0d4-121a-4073-9e85-a2277a5078d8-kube-api-access-476vk\") pod \"nmstate-handler-ndhsp\" (UID: \"912fc0d4-121a-4073-9e85-a2277a5078d8\") " pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.487655 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.487739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72454e30-d40f-408d-93f6-c0cf1ce2f400-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.487765 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfhr\" (UniqueName: \"kubernetes.io/projected/72454e30-d40f-408d-93f6-c0cf1ce2f400-kube-api-access-gwfhr\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: E1205 
20:18:23.487803 4885 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 05 20:18:23 crc kubenswrapper[4885]: E1205 20:18:23.487873 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert podName:72454e30-d40f-408d-93f6-c0cf1ce2f400 nodeName:}" failed. No retries permitted until 2025-12-05 20:18:23.987849251 +0000 UTC m=+769.284664912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-vgfwc" (UID: "72454e30-d40f-408d-93f6-c0cf1ce2f400") : secret "plugin-serving-cert" not found Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.488684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72454e30-d40f-408d-93f6-c0cf1ce2f400-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.508380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfhr\" (UniqueName: \"kubernetes.io/projected/72454e30-d40f-408d-93f6-c0cf1ce2f400-kube-api-access-gwfhr\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.532337 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-679bf84676-ts4k5"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.533220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.545483 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679bf84676-ts4k5"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.554959 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.587345 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-oauth-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-service-ca\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-oauth-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-console-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588815 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-trusted-ca-bundle\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.588849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fh8s\" (UniqueName: \"kubernetes.io/projected/deff7379-aca4-47ae-829b-2e29ce508319-kube-api-access-7fh8s\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.691726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.691952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-service-ca\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.691982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-oauth-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.691998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-console-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.692048 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-trusted-ca-bundle\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.692072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fh8s\" (UniqueName: \"kubernetes.io/projected/deff7379-aca4-47ae-829b-2e29ce508319-kube-api-access-7fh8s\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.692118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-oauth-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.692914 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-service-ca\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.693553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-trusted-ca-bundle\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.693555 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-oauth-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.695781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/deff7379-aca4-47ae-829b-2e29ce508319-console-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.696316 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-serving-cert\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.696441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/deff7379-aca4-47ae-829b-2e29ce508319-console-oauth-config\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.709540 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fh8s\" (UniqueName: \"kubernetes.io/projected/deff7379-aca4-47ae-829b-2e29ce508319-kube-api-access-7fh8s\") pod \"console-679bf84676-ts4k5\" (UID: \"deff7379-aca4-47ae-829b-2e29ce508319\") " pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.764953 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld"] Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.793186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" event={"ID":"7e345d16-e7f9-4881-a031-eb5ef37e22b3","Type":"ContainerStarted","Data":"7689ba3f755a9e452f3761ea8aefaae100350f7f2e3d0221ec6497607bd8d168"} Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.794405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ndhsp" event={"ID":"912fc0d4-121a-4073-9e85-a2277a5078d8","Type":"ContainerStarted","Data":"c5b4298c80327748aac3ba62c75f6d9cb6c9b51b8aa08de85aa7ba5784863df2"} Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.851884 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.894981 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.901293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7001b6ac-1126-4d81-9148-47e6f7f830c1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph4g7\" (UID: \"7001b6ac-1126-4d81-9148-47e6f7f830c1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:23 crc kubenswrapper[4885]: I1205 20:18:23.997614 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.002644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72454e30-d40f-408d-93f6-c0cf1ce2f400-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-vgfwc\" (UID: \"72454e30-d40f-408d-93f6-c0cf1ce2f400\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.172123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.280499 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.339865 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679bf84676-ts4k5"] Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.368570 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7"] Dec 05 20:18:24 crc kubenswrapper[4885]: W1205 20:18:24.382398 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7001b6ac_1126_4d81_9148_47e6f7f830c1.slice/crio-231cecfff66bdcdafa121dea04a81d693cd8976a5c70c4352847784e9428ea7d WatchSource:0}: Error finding container 231cecfff66bdcdafa121dea04a81d693cd8976a5c70c4352847784e9428ea7d: Status 404 returned error can't find the container with id 231cecfff66bdcdafa121dea04a81d693cd8976a5c70c4352847784e9428ea7d Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.476409 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc"] Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.803230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" event={"ID":"72454e30-d40f-408d-93f6-c0cf1ce2f400","Type":"ContainerStarted","Data":"bc93b56a0e58609b663824675420ce0f3038140536903e2ece6d13b764ffb106"} Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.804200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" event={"ID":"7001b6ac-1126-4d81-9148-47e6f7f830c1","Type":"ContainerStarted","Data":"231cecfff66bdcdafa121dea04a81d693cd8976a5c70c4352847784e9428ea7d"} Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.805931 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679bf84676-ts4k5" event={"ID":"deff7379-aca4-47ae-829b-2e29ce508319","Type":"ContainerStarted","Data":"a46b9c1ed30e95b6f0f96aa5293c053daad98af7a5cd6cd65de99c1ccef5e5fc"} Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.805966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679bf84676-ts4k5" event={"ID":"deff7379-aca4-47ae-829b-2e29ce508319","Type":"ContainerStarted","Data":"5bf6ccdc3d4c7a12b79e74d5eee6369c9874e373e13d21c6736f0d9fe729909e"} Dec 05 20:18:24 crc kubenswrapper[4885]: I1205 20:18:24.824242 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-679bf84676-ts4k5" podStartSLOduration=1.824223505 podStartE2EDuration="1.824223505s" podCreationTimestamp="2025-12-05 20:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:18:24.819407644 +0000 UTC m=+770.116223345" watchObservedRunningTime="2025-12-05 20:18:24.824223505 +0000 UTC m=+770.121039166" Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.830819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" event={"ID":"72454e30-d40f-408d-93f6-c0cf1ce2f400","Type":"ContainerStarted","Data":"3c1d50184858e26130f1c8337d33c2365d6f13277e0e7edd7d1559828bc797c7"} Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.832782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ndhsp" 
event={"ID":"912fc0d4-121a-4073-9e85-a2277a5078d8","Type":"ContainerStarted","Data":"9c20a75470205fd09355bd24221c89618aa0fc352d2a3201599719da117926a0"} Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.833397 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.835871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" event={"ID":"7e345d16-e7f9-4881-a031-eb5ef37e22b3","Type":"ContainerStarted","Data":"bfa2dbd0862f23d142ed90244499e10797cbc9ccc0f810daa8592b09d7e15b45"} Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.837392 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" event={"ID":"7001b6ac-1126-4d81-9148-47e6f7f830c1","Type":"ContainerStarted","Data":"0289b74b1399fbb32bcd6a330d39dc23a8f666b5cc6c9eb43edb5ba5d6306edb"} Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.837994 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.872380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-vgfwc" podStartSLOduration=2.114955454 podStartE2EDuration="4.872361756s" podCreationTimestamp="2025-12-05 20:18:23 +0000 UTC" firstStartedPulling="2025-12-05 20:18:24.491799878 +0000 UTC m=+769.788615539" lastFinishedPulling="2025-12-05 20:18:27.24920618 +0000 UTC m=+772.546021841" observedRunningTime="2025-12-05 20:18:27.846897188 +0000 UTC m=+773.143712849" watchObservedRunningTime="2025-12-05 20:18:27.872361756 +0000 UTC m=+773.169177417" Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.879917 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" podStartSLOduration=2.012562418 podStartE2EDuration="4.879897772s" podCreationTimestamp="2025-12-05 20:18:23 +0000 UTC" firstStartedPulling="2025-12-05 20:18:24.384796047 +0000 UTC m=+769.681611708" lastFinishedPulling="2025-12-05 20:18:27.252131371 +0000 UTC m=+772.548947062" observedRunningTime="2025-12-05 20:18:27.867518635 +0000 UTC m=+773.164334316" watchObservedRunningTime="2025-12-05 20:18:27.879897772 +0000 UTC m=+773.176713443" Dec 05 20:18:27 crc kubenswrapper[4885]: I1205 20:18:27.893306 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ndhsp" podStartSLOduration=1.2366640740000001 podStartE2EDuration="4.893283401s" podCreationTimestamp="2025-12-05 20:18:23 +0000 UTC" firstStartedPulling="2025-12-05 20:18:23.611905048 +0000 UTC m=+768.908720709" lastFinishedPulling="2025-12-05 20:18:27.268524375 +0000 UTC m=+772.565340036" observedRunningTime="2025-12-05 20:18:27.891543787 +0000 UTC m=+773.188359478" watchObservedRunningTime="2025-12-05 20:18:27.893283401 +0000 UTC m=+773.190099082" Dec 05 20:18:29 crc kubenswrapper[4885]: I1205 20:18:29.851653 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" event={"ID":"7e345d16-e7f9-4881-a031-eb5ef37e22b3","Type":"ContainerStarted","Data":"f826eb93812305f6f697e9e27c442d9f0faa850e685ef5c7b8e86a6e35f50977"} Dec 05 20:18:29 crc kubenswrapper[4885]: I1205 20:18:29.874244 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lhpld" podStartSLOduration=1.287627452 podStartE2EDuration="6.874225534s" podCreationTimestamp="2025-12-05 20:18:23 +0000 UTC" firstStartedPulling="2025-12-05 20:18:23.762823977 +0000 UTC m=+769.059639638" lastFinishedPulling="2025-12-05 20:18:29.349422039 +0000 UTC m=+774.646237720" observedRunningTime="2025-12-05 20:18:29.873466599 +0000 UTC m=+775.170282270" watchObservedRunningTime="2025-12-05 20:18:29.874225534 +0000 UTC m=+775.171041195" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.627453 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ndhsp" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.852638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.852707 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.857304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.880206 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-679bf84676-ts4k5" Dec 05 20:18:33 crc kubenswrapper[4885]: I1205 20:18:33.939032 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:18:44 crc kubenswrapper[4885]: I1205 20:18:44.182903 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph4g7" Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.630861 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.631222 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.631275 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.631851 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.631918 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d" 
gracePeriod=600 Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.991088 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d" exitCode=0 Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.991169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d"} Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.991381 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f"} Dec 05 20:18:46 crc kubenswrapper[4885]: I1205 20:18:46.991409 4885 scope.go:117] "RemoveContainer" containerID="53112215960c3263d15b18ec4571f7146c46646867b9a8f4171bc569cf2437c9" Dec 05 20:18:58 crc kubenswrapper[4885]: I1205 20:18:58.978866 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jdrlk" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" containerID="cri-o://aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829" gracePeriod=15 Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.462045 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m"] Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.475811 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.488482 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m"] Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.523663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.633072 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8krk\" (UniqueName: \"kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.633141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.633235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.734410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.734872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.735001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8krk\" (UniqueName: \"kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.735385 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.735585 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.767303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8krk\" (UniqueName: \"kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.835660 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jdrlk_543415d6-6aec-42f4-953f-3a760aefe1f2/console/0.log" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.835727 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.845649 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937296 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6986h\" (UniqueName: \"kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937352 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937396 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca\") pod 
\"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937520 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.937556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config\") pod \"543415d6-6aec-42f4-953f-3a760aefe1f2\" (UID: \"543415d6-6aec-42f4-953f-3a760aefe1f2\") " Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.938777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.939211 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca" (OuterVolumeSpecName: "service-ca") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.939364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.939908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config" (OuterVolumeSpecName: "console-config") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.943996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.944499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h" (OuterVolumeSpecName: "kube-api-access-6986h") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "kube-api-access-6986h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:18:59 crc kubenswrapper[4885]: I1205 20:18:59.946092 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "543415d6-6aec-42f4-953f-3a760aefe1f2" (UID: "543415d6-6aec-42f4-953f-3a760aefe1f2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.032876 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m"] Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038337 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038369 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038380 4885 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038391 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/543415d6-6aec-42f4-953f-3a760aefe1f2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038402 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6986h\" (UniqueName: \"kubernetes.io/projected/543415d6-6aec-42f4-953f-3a760aefe1f2-kube-api-access-6986h\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038413 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.038423 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/543415d6-6aec-42f4-953f-3a760aefe1f2-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.064098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" event={"ID":"2799bcd8-694a-4fdc-b243-2780761ecda7","Type":"ContainerStarted","Data":"b94ac3e3970cc440386d91852fccf7429b134ce7d64528f8db407933d45a4c1e"} Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.065830 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jdrlk_543415d6-6aec-42f4-953f-3a760aefe1f2/console/0.log" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.065867 4885 generic.go:334] "Generic (PLEG): container finished" podID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerID="aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829" exitCode=2 Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.065893 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdrlk" event={"ID":"543415d6-6aec-42f4-953f-3a760aefe1f2","Type":"ContainerDied","Data":"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829"} Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.065910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdrlk" event={"ID":"543415d6-6aec-42f4-953f-3a760aefe1f2","Type":"ContainerDied","Data":"d959286b80431f0d2f2ea0a360c5c69d30c4ece645c4b1550f0c2521a6d077a7"} Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.065927 4885 scope.go:117] "RemoveContainer" containerID="aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.066080 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdrlk" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.098928 4885 scope.go:117] "RemoveContainer" containerID="aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829" Dec 05 20:19:00 crc kubenswrapper[4885]: E1205 20:19:00.099414 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829\": container with ID starting with aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829 not found: ID does not exist" containerID="aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.099442 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829"} err="failed to get container status \"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829\": rpc error: code = NotFound desc = could not find container \"aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829\": container with ID starting with aafdaefedfa89069116c7ab58f2ef94860dbf82afd838806c9388a9f47f0f829 not found: ID does not exist" Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.101282 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:19:00 crc kubenswrapper[4885]: I1205 20:19:00.104820 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jdrlk"] Dec 05 20:19:01 crc kubenswrapper[4885]: I1205 20:19:01.072764 4885 generic.go:334] "Generic (PLEG): container finished" podID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerID="095b7013b71848dffdde2327bb540c1911a3f517f3bdb1ae8e6b7ef4dabcf689" exitCode=0 Dec 05 20:19:01 crc kubenswrapper[4885]: I1205 20:19:01.072852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" event={"ID":"2799bcd8-694a-4fdc-b243-2780761ecda7","Type":"ContainerDied","Data":"095b7013b71848dffdde2327bb540c1911a3f517f3bdb1ae8e6b7ef4dabcf689"} Dec 05 20:19:01 crc kubenswrapper[4885]: I1205 20:19:01.188518 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" path="/var/lib/kubelet/pods/543415d6-6aec-42f4-953f-3a760aefe1f2/volumes" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.793430 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:02 
crc kubenswrapper[4885]: E1205 20:19:02.793910 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.793923 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.794041 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="543415d6-6aec-42f4-953f-3a760aefe1f2" containerName="console" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.794728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.837224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.880394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.880437 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j68l7\" (UniqueName: \"kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.880477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.981422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.981489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j68l7\" (UniqueName: \"kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.981567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.982040 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:02 crc kubenswrapper[4885]: I1205 20:19:02.982207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:03 crc kubenswrapper[4885]: I1205 20:19:03.005352 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j68l7\" (UniqueName: \"kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7\") pod \"redhat-operators-xjr7f\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:03 crc kubenswrapper[4885]: I1205 20:19:03.089567 4885 generic.go:334] "Generic (PLEG): container finished" podID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerID="6911edc3e46a3c872f25329486d7b0b27f755f771944b701bec5a321fc8c0925" exitCode=0 Dec 05 20:19:03 crc kubenswrapper[4885]: I1205 20:19:03.089620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" event={"ID":"2799bcd8-694a-4fdc-b243-2780761ecda7","Type":"ContainerDied","Data":"6911edc3e46a3c872f25329486d7b0b27f755f771944b701bec5a321fc8c0925"} Dec 05 20:19:03 crc kubenswrapper[4885]: I1205 20:19:03.165117 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:03 crc kubenswrapper[4885]: W1205 20:19:03.428275 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44bfe762_5b1a_4054_8a4d_deb19abb3e2b.slice/crio-00109434d972c6dd18041b04000e02f1abca4cdb57b434aca9525f87b178d0de WatchSource:0}: Error finding container 00109434d972c6dd18041b04000e02f1abca4cdb57b434aca9525f87b178d0de: Status 404 returned error can't find the container with id 00109434d972c6dd18041b04000e02f1abca4cdb57b434aca9525f87b178d0de Dec 05 20:19:03 crc kubenswrapper[4885]: I1205 20:19:03.428404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:04 crc kubenswrapper[4885]: I1205 20:19:04.098552 4885 generic.go:334] "Generic (PLEG): container finished" podID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerID="d163996871366a6370e9e4713459ce0192f8954018c0f8f1692a14283cd6d439" exitCode=0 Dec 05 20:19:04 crc kubenswrapper[4885]: I1205 20:19:04.098697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" event={"ID":"2799bcd8-694a-4fdc-b243-2780761ecda7","Type":"ContainerDied","Data":"d163996871366a6370e9e4713459ce0192f8954018c0f8f1692a14283cd6d439"} Dec 05 20:19:04 crc kubenswrapper[4885]: I1205 20:19:04.100786 4885 generic.go:334] "Generic (PLEG): container finished" podID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerID="4403d51b0bb884246d666831ee867e744459d0d0fee49ef3fbf72164bdda0942" exitCode=0 Dec 05 20:19:04 crc kubenswrapper[4885]: I1205 20:19:04.100841 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" 
event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerDied","Data":"4403d51b0bb884246d666831ee867e744459d0d0fee49ef3fbf72164bdda0942"} Dec 05 20:19:04 crc kubenswrapper[4885]: I1205 20:19:04.100873 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerStarted","Data":"00109434d972c6dd18041b04000e02f1abca4cdb57b434aca9525f87b178d0de"} Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.109621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerStarted","Data":"4467ed87b913cfc95145ca5d5ea8dbe2ab0e97ca755409dbfded9e1cf6c13b46"} Dec 05 20:19:05 crc kubenswrapper[4885]: E1205 20:19:05.341505 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.395401 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.524077 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8krk\" (UniqueName: \"kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk\") pod \"2799bcd8-694a-4fdc-b243-2780761ecda7\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.524199 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util\") pod \"2799bcd8-694a-4fdc-b243-2780761ecda7\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.524218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle\") pod \"2799bcd8-694a-4fdc-b243-2780761ecda7\" (UID: \"2799bcd8-694a-4fdc-b243-2780761ecda7\") " Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.525263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle" (OuterVolumeSpecName: "bundle") pod "2799bcd8-694a-4fdc-b243-2780761ecda7" (UID: "2799bcd8-694a-4fdc-b243-2780761ecda7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.532710 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk" (OuterVolumeSpecName: "kube-api-access-k8krk") pod "2799bcd8-694a-4fdc-b243-2780761ecda7" (UID: "2799bcd8-694a-4fdc-b243-2780761ecda7"). InnerVolumeSpecName "kube-api-access-k8krk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.626298 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8krk\" (UniqueName: \"kubernetes.io/projected/2799bcd8-694a-4fdc-b243-2780761ecda7-kube-api-access-k8krk\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.626347 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.675822 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util" (OuterVolumeSpecName: "util") pod "2799bcd8-694a-4fdc-b243-2780761ecda7" (UID: "2799bcd8-694a-4fdc-b243-2780761ecda7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:19:05 crc kubenswrapper[4885]: I1205 20:19:05.727167 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2799bcd8-694a-4fdc-b243-2780761ecda7-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:06 crc kubenswrapper[4885]: I1205 20:19:06.121083 4885 generic.go:334] "Generic (PLEG): container finished" podID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerID="4467ed87b913cfc95145ca5d5ea8dbe2ab0e97ca755409dbfded9e1cf6c13b46" exitCode=0 Dec 05 20:19:06 crc kubenswrapper[4885]: I1205 20:19:06.121241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerDied","Data":"4467ed87b913cfc95145ca5d5ea8dbe2ab0e97ca755409dbfded9e1cf6c13b46"} Dec 05 20:19:06 crc kubenswrapper[4885]: I1205 20:19:06.124921 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" event={"ID":"2799bcd8-694a-4fdc-b243-2780761ecda7","Type":"ContainerDied","Data":"b94ac3e3970cc440386d91852fccf7429b134ce7d64528f8db407933d45a4c1e"} Dec 05 20:19:06 crc kubenswrapper[4885]: I1205 20:19:06.124953 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b94ac3e3970cc440386d91852fccf7429b134ce7d64528f8db407933d45a4c1e" Dec 05 20:19:06 crc kubenswrapper[4885]: I1205 20:19:06.124976 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m" Dec 05 20:19:07 crc kubenswrapper[4885]: I1205 20:19:07.134741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerStarted","Data":"7328bd7ee5f929175127d4afd910a683c9a388d704689469a2ffa5cecd9fbe39"} Dec 05 20:19:07 crc kubenswrapper[4885]: I1205 20:19:07.155871 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xjr7f" podStartSLOduration=2.448764508 podStartE2EDuration="5.155848425s" podCreationTimestamp="2025-12-05 20:19:02 +0000 UTC" firstStartedPulling="2025-12-05 20:19:04.103202487 +0000 UTC m=+809.400018158" lastFinishedPulling="2025-12-05 20:19:06.810286374 +0000 UTC m=+812.107102075" observedRunningTime="2025-12-05 20:19:07.153149911 +0000 UTC m=+812.449965602" watchObservedRunningTime="2025-12-05 20:19:07.155848425 +0000 UTC m=+812.452664096" Dec 05 20:19:13 crc kubenswrapper[4885]: I1205 20:19:13.165365 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:13 crc kubenswrapper[4885]: I1205 20:19:13.165832 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:14 crc kubenswrapper[4885]: I1205 20:19:14.243703 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xjr7f" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="registry-server" probeResult="failure" output=< Dec 05 20:19:14 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 20:19:14 crc kubenswrapper[4885]: > Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.284905 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7"] Dec 05 20:19:19 crc kubenswrapper[4885]: E1205 20:19:19.285432 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="pull" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.285453 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="pull" Dec 05 20:19:19 crc kubenswrapper[4885]: E1205 20:19:19.285462 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="extract" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.285468 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="extract" Dec 05 20:19:19 crc kubenswrapper[4885]: E1205 20:19:19.285484 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="util" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.285490 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="util" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.285583 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2799bcd8-694a-4fdc-b243-2780761ecda7" containerName="extract" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.285948 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.288664 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.288826 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7jpm6" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.288944 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.289784 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.289826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.302727 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7"] Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.402781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv9nn\" (UniqueName: \"kubernetes.io/projected/dd4c62d1-80af-4d61-bc04-6ac5c8259121-kube-api-access-fv9nn\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.402849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-apiservice-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.402938 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-webhook-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.504146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-webhook-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.504259 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv9nn\" (UniqueName: \"kubernetes.io/projected/dd4c62d1-80af-4d61-bc04-6ac5c8259121-kube-api-access-fv9nn\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.504300 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-apiservice-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.511736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-apiservice-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.517918 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd4c62d1-80af-4d61-bc04-6ac5c8259121-webhook-cert\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.525741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv9nn\" (UniqueName: \"kubernetes.io/projected/dd4c62d1-80af-4d61-bc04-6ac5c8259121-kube-api-access-fv9nn\") pod \"metallb-operator-controller-manager-fb9f8748-k8dk7\" (UID: \"dd4c62d1-80af-4d61-bc04-6ac5c8259121\") " pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.532725 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh"] Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.533802 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.536457 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wct6g" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.536702 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.536968 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.549480 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh"] Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.602567 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.706648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vt9\" (UniqueName: \"kubernetes.io/projected/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-kube-api-access-v9vt9\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.706712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-webhook-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.706750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.808290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vt9\" (UniqueName: \"kubernetes.io/projected/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-kube-api-access-v9vt9\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.808621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-webhook-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.808785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.815186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-apiservice-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.826880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vt9\" (UniqueName: \"kubernetes.io/projected/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-kube-api-access-v9vt9\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: 
\"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.828641 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8263fedc-0c2a-4de8-8d5c-47aa32b745ee-webhook-cert\") pod \"metallb-operator-webhook-server-5dcf889d57-wtshh\" (UID: \"8263fedc-0c2a-4de8-8d5c-47aa32b745ee\") " pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:19 crc kubenswrapper[4885]: I1205 20:19:19.862247 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:20 crc kubenswrapper[4885]: I1205 20:19:20.082706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7"] Dec 05 20:19:20 crc kubenswrapper[4885]: I1205 20:19:20.139938 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh"] Dec 05 20:19:20 crc kubenswrapper[4885]: W1205 20:19:20.142051 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8263fedc_0c2a_4de8_8d5c_47aa32b745ee.slice/crio-1e9145bb6047ec9f67313797e38df2105cdc242bd6af025f8c7a0808b36e6d7a WatchSource:0}: Error finding container 1e9145bb6047ec9f67313797e38df2105cdc242bd6af025f8c7a0808b36e6d7a: Status 404 returned error can't find the container with id 1e9145bb6047ec9f67313797e38df2105cdc242bd6af025f8c7a0808b36e6d7a Dec 05 20:19:20 crc kubenswrapper[4885]: I1205 20:19:20.230198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" event={"ID":"8263fedc-0c2a-4de8-8d5c-47aa32b745ee","Type":"ContainerStarted","Data":"1e9145bb6047ec9f67313797e38df2105cdc242bd6af025f8c7a0808b36e6d7a"} Dec 05 20:19:20 crc kubenswrapper[4885]: I1205 20:19:20.231005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" event={"ID":"dd4c62d1-80af-4d61-bc04-6ac5c8259121","Type":"ContainerStarted","Data":"52b11e4c67bbae1074add61ac81bce8296c702323c49231b8ed762db26de0d53"} Dec 05 20:19:23 crc kubenswrapper[4885]: I1205 20:19:23.222853 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:23 crc kubenswrapper[4885]: I1205 20:19:23.272313 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:23 crc kubenswrapper[4885]: I1205 20:19:23.454442 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:24 crc kubenswrapper[4885]: I1205 20:19:24.262045 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xjr7f" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="registry-server" containerID="cri-o://7328bd7ee5f929175127d4afd910a683c9a388d704689469a2ffa5cecd9fbe39" gracePeriod=2 Dec 05 20:19:25 crc kubenswrapper[4885]: I1205 20:19:25.269292 4885 generic.go:334] "Generic (PLEG): container finished" podID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerID="7328bd7ee5f929175127d4afd910a683c9a388d704689469a2ffa5cecd9fbe39" exitCode=0 Dec 05 20:19:25 crc 
kubenswrapper[4885]: I1205 20:19:25.269336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerDied","Data":"7328bd7ee5f929175127d4afd910a683c9a388d704689469a2ffa5cecd9fbe39"} Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.327529 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.504593 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities\") pod \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.504669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content\") pod \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.504717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j68l7\" (UniqueName: \"kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7\") pod \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\" (UID: \"44bfe762-5b1a-4054-8a4d-deb19abb3e2b\") " Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.505469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities" (OuterVolumeSpecName: "utilities") pod "44bfe762-5b1a-4054-8a4d-deb19abb3e2b" (UID: "44bfe762-5b1a-4054-8a4d-deb19abb3e2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.512762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7" (OuterVolumeSpecName: "kube-api-access-j68l7") pod "44bfe762-5b1a-4054-8a4d-deb19abb3e2b" (UID: "44bfe762-5b1a-4054-8a4d-deb19abb3e2b"). InnerVolumeSpecName "kube-api-access-j68l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.605875 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44bfe762-5b1a-4054-8a4d-deb19abb3e2b" (UID: "44bfe762-5b1a-4054-8a4d-deb19abb3e2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.606834 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.606937 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:26 crc kubenswrapper[4885]: I1205 20:19:26.606968 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j68l7\" (UniqueName: \"kubernetes.io/projected/44bfe762-5b1a-4054-8a4d-deb19abb3e2b-kube-api-access-j68l7\") on node \"crc\" DevicePath \"\"" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.285418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" event={"ID":"dd4c62d1-80af-4d61-bc04-6ac5c8259121","Type":"ContainerStarted","Data":"8b311e4ee1a3aeef99ab1309afdcc009d10ebcbf5bf239703e1e703d0d1fb5c6"} Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.285722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.289576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjr7f" event={"ID":"44bfe762-5b1a-4054-8a4d-deb19abb3e2b","Type":"ContainerDied","Data":"00109434d972c6dd18041b04000e02f1abca4cdb57b434aca9525f87b178d0de"} Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.289619 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjr7f" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.289627 4885 scope.go:117] "RemoveContainer" containerID="7328bd7ee5f929175127d4afd910a683c9a388d704689469a2ffa5cecd9fbe39" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.292033 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" event={"ID":"8263fedc-0c2a-4de8-8d5c-47aa32b745ee","Type":"ContainerStarted","Data":"b6aa076b190b1a41000098cd674be70ab456e67ba32f3cf2f50faad8aad4b90d"} Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.292236 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.314216 4885 scope.go:117] "RemoveContainer" containerID="4467ed87b913cfc95145ca5d5ea8dbe2ab0e97ca755409dbfded9e1cf6c13b46" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.316915 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" podStartSLOduration=2.282776052 podStartE2EDuration="8.31689443s" podCreationTimestamp="2025-12-05 20:19:19 +0000 UTC" firstStartedPulling="2025-12-05 20:19:20.093302605 +0000 UTC m=+825.390118266" lastFinishedPulling="2025-12-05 20:19:26.127420983 +0000 UTC m=+831.424236644" observedRunningTime="2025-12-05 20:19:27.31179795 +0000 UTC m=+832.608613631" watchObservedRunningTime="2025-12-05 20:19:27.31689443 +0000 UTC m=+832.613710101" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.334361 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.339175 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xjr7f"] Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.344074 4885 scope.go:117] "RemoveContainer" containerID="4403d51b0bb884246d666831ee867e744459d0d0fee49ef3fbf72164bdda0942" Dec 05 20:19:27 crc kubenswrapper[4885]: I1205 20:19:27.365392 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" podStartSLOduration=2.368471604 podStartE2EDuration="8.365370063s" podCreationTimestamp="2025-12-05 20:19:19 +0000 UTC" firstStartedPulling="2025-12-05 20:19:20.144601147 +0000 UTC m=+825.441416808" lastFinishedPulling="2025-12-05 20:19:26.141499606 +0000 UTC m=+831.438315267" observedRunningTime="2025-12-05 20:19:27.358867209 +0000 UTC m=+832.655682880" watchObservedRunningTime="2025-12-05 20:19:27.365370063 +0000 UTC m=+832.662185734" Dec 05 20:19:29 crc kubenswrapper[4885]: I1205 20:19:29.180743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" path="/var/lib/kubelet/pods/44bfe762-5b1a-4054-8a4d-deb19abb3e2b/volumes" Dec 05 20:19:39 crc kubenswrapper[4885]: I1205 20:19:39.869091 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5dcf889d57-wtshh" Dec 05 20:19:59 crc kubenswrapper[4885]: I1205 20:19:59.605712 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-fb9f8748-k8dk7" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.334152 4885 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/frr-k8s-2qkxh"] Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.334384 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="registry-server" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.334396 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="registry-server" Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.334419 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="extract-utilities" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.334425 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="extract-utilities" Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.334435 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="extract-content" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.334442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="extract-content" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.334542 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bfe762-5b1a-4054-8a4d-deb19abb3e2b" containerName="registry-server" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.336405 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.338986 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.339311 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5gnhz" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.340437 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.348980 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq"] Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.349853 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.351933 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.370897 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq"] Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.437819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-sockets\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.437860 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-startup\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.437885 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xltq\" (UniqueName: \"kubernetes.io/projected/f0f8b2ce-10b2-491b-9100-34835c07e175-kube-api-access-4xltq\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438288 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-reloader\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438304 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wk9k\" (UniqueName: \"kubernetes.io/projected/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-kube-api-access-9wk9k\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.438382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-conf\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.519688 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5jq2d"] Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.520663 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.522308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-cckrf" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.522530 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.522675 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.526115 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xltq\" (UniqueName: \"kubernetes.io/projected/f0f8b2ce-10b2-491b-9100-34835c07e175-kube-api-access-4xltq\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539478 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-reloader\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk9k\" (UniqueName: \"kubernetes.io/projected/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-kube-api-access-9wk9k\") pod 
\"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-conf\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-sockets\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.539619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-startup\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.540620 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-startup\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.540899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.541232 4885 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.541281 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert podName:a1b920f0-0596-43ef-b94b-d3035f0e5e1c nodeName:}" failed. No retries permitted until 2025-12-05 20:20:01.041265713 +0000 UTC m=+866.338081374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert") pod "frr-k8s-webhook-server-7fcb986d4-p9slq" (UID: "a1b920f0-0596-43ef-b94b-d3035f0e5e1c") : secret "frr-k8s-webhook-server-cert" not found Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.541663 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-reloader\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.541735 4885 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.541765 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs podName:f0f8b2ce-10b2-491b-9100-34835c07e175 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:20:01.041755789 +0000 UTC m=+866.338571450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs") pod "frr-k8s-2qkxh" (UID: "f0f8b2ce-10b2-491b-9100-34835c07e175") : secret "frr-k8s-certs-secret" not found Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.542164 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-conf\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.542404 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f0f8b2ce-10b2-491b-9100-34835c07e175-frr-sockets\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.549283 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-gwwj5"] Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.550544 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.563318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.564337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xltq\" (UniqueName: \"kubernetes.io/projected/f0f8b2ce-10b2-491b-9100-34835c07e175-kube-api-access-4xltq\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.570596 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk9k\" (UniqueName: \"kubernetes.io/projected/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-kube-api-access-9wk9k\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.580265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gwwj5"] Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640500 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-cert\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640547 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-metrics-certs\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640580 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nflvj\" (UniqueName: \"kubernetes.io/projected/61bdac93-6e5a-4b95-a146-ea0874dc5962-kube-api-access-nflvj\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2fa36864-508b-488b-8830-d60337213cca-metallb-excludel2\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640793 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.640854 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrfk\" (UniqueName: \"kubernetes.io/projected/2fa36864-508b-488b-8830-d60337213cca-kube-api-access-rhrfk\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrfk\" (UniqueName: \"kubernetes.io/projected/2fa36864-508b-488b-8830-d60337213cca-kube-api-access-rhrfk\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-cert\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742706 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-metrics-certs\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nflvj\" (UniqueName: \"kubernetes.io/projected/61bdac93-6e5a-4b95-a146-ea0874dc5962-kube-api-access-nflvj\") pod 
\"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.742801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2fa36864-508b-488b-8830-d60337213cca-metallb-excludel2\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.742891 4885 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.742962 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist podName:2fa36864-508b-488b-8830-d60337213cca nodeName:}" failed. No retries permitted until 2025-12-05 20:20:01.242945357 +0000 UTC m=+866.539761018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist") pod "speaker-5jq2d" (UID: "2fa36864-508b-488b-8830-d60337213cca") : secret "metallb-memberlist" not found Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.743051 4885 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 20:20:00 crc kubenswrapper[4885]: E1205 20:20:00.743105 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs podName:61bdac93-6e5a-4b95-a146-ea0874dc5962 nodeName:}" failed. No retries permitted until 2025-12-05 20:20:01.243089872 +0000 UTC m=+866.539905533 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs") pod "controller-f8648f98b-gwwj5" (UID: "61bdac93-6e5a-4b95-a146-ea0874dc5962") : secret "controller-certs-secret" not found Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.743372 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.743843 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2fa36864-508b-488b-8830-d60337213cca-metallb-excludel2\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.746077 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.746529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-metrics-certs\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.756730 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-cert\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.766803 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nflvj\" (UniqueName: \"kubernetes.io/projected/61bdac93-6e5a-4b95-a146-ea0874dc5962-kube-api-access-nflvj\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:00 crc kubenswrapper[4885]: I1205 20:20:00.767567 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrfk\" (UniqueName: \"kubernetes.io/projected/2fa36864-508b-488b-8830-d60337213cca-kube-api-access-rhrfk\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.046298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.046364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.049870 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0f8b2ce-10b2-491b-9100-34835c07e175-metrics-certs\") pod \"frr-k8s-2qkxh\" (UID: \"f0f8b2ce-10b2-491b-9100-34835c07e175\") " pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.050185 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1b920f0-0596-43ef-b94b-d3035f0e5e1c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p9slq\" (UID: \"a1b920f0-0596-43ef-b94b-d3035f0e5e1c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.248628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:01 crc kubenswrapper[4885]: E1205 20:20:01.248754 4885 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.248777 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:01 crc kubenswrapper[4885]: E1205 20:20:01.248807 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist podName:2fa36864-508b-488b-8830-d60337213cca nodeName:}" failed. No retries permitted until 2025-12-05 20:20:02.248792975 +0000 UTC m=+867.545608636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist") pod "speaker-5jq2d" (UID: "2fa36864-508b-488b-8830-d60337213cca") : secret "metallb-memberlist" not found Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.253836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61bdac93-6e5a-4b95-a146-ea0874dc5962-metrics-certs\") pod \"controller-f8648f98b-gwwj5\" (UID: \"61bdac93-6e5a-4b95-a146-ea0874dc5962\") " pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.271061 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.278843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.464699 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-gwwj5" Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.485405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"d4ff5cbc5e7907daf09e9f71d6195b258e2e939e08de4cb7eef5479bd1c634d3"} Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.519923 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq"] Dec 05 20:20:01 crc kubenswrapper[4885]: W1205 20:20:01.535415 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b920f0_0596_43ef_b94b_d3035f0e5e1c.slice/crio-d8afab7d08afc4f34ecca6edab7a6190ebec12b43329216e00561524d0b3bed8 WatchSource:0}: Error finding container d8afab7d08afc4f34ecca6edab7a6190ebec12b43329216e00561524d0b3bed8: Status 404 returned error can't find the container with id d8afab7d08afc4f34ecca6edab7a6190ebec12b43329216e00561524d0b3bed8 Dec 05 20:20:01 crc kubenswrapper[4885]: I1205 20:20:01.849671 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gwwj5"] Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.262715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.280589 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2fa36864-508b-488b-8830-d60337213cca-memberlist\") pod \"speaker-5jq2d\" (UID: \"2fa36864-508b-488b-8830-d60337213cca\") " pod="metallb-system/speaker-5jq2d" Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.334168 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:20:02 crc kubenswrapper[4885]: W1205 20:20:02.354300 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa36864_508b_488b_8830_d60337213cca.slice/crio-324047e2d5266c0a578e5e0b4f9abbc25ec5220d76b0c1528b6c93aabfab3768 WatchSource:0}: Error finding container 324047e2d5266c0a578e5e0b4f9abbc25ec5220d76b0c1528b6c93aabfab3768: Status 404 returned error can't find the container with id 324047e2d5266c0a578e5e0b4f9abbc25ec5220d76b0c1528b6c93aabfab3768
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.497073 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" event={"ID":"a1b920f0-0596-43ef-b94b-d3035f0e5e1c","Type":"ContainerStarted","Data":"d8afab7d08afc4f34ecca6edab7a6190ebec12b43329216e00561524d0b3bed8"}
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.502447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gwwj5" event={"ID":"61bdac93-6e5a-4b95-a146-ea0874dc5962","Type":"ContainerStarted","Data":"6168fe249ab714fd09781dd2b6a96e091b744bce5d18eaa361a3576c3aa93261"}
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.502484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gwwj5" event={"ID":"61bdac93-6e5a-4b95-a146-ea0874dc5962","Type":"ContainerStarted","Data":"fef22e0fe7aeee94a2f9b868595a801a9f8f3a72a4314c9877b19d954122b514"}
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.502494 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gwwj5" event={"ID":"61bdac93-6e5a-4b95-a146-ea0874dc5962","Type":"ContainerStarted","Data":"0d898a800bd0d022b31efee9f71f6d3a0712ba4a40609caafa283322326d57fe"}
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.503061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-gwwj5"
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.511534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5jq2d" event={"ID":"2fa36864-508b-488b-8830-d60337213cca","Type":"ContainerStarted","Data":"324047e2d5266c0a578e5e0b4f9abbc25ec5220d76b0c1528b6c93aabfab3768"}
Dec 05 20:20:02 crc kubenswrapper[4885]: I1205 20:20:02.541957 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-gwwj5" podStartSLOduration=2.541935557 podStartE2EDuration="2.541935557s" podCreationTimestamp="2025-12-05 20:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:20:02.538740565 +0000 UTC m=+867.835556226" watchObservedRunningTime="2025-12-05 20:20:02.541935557 +0000 UTC m=+867.838751218"
Dec 05 20:20:03 crc kubenswrapper[4885]: I1205 20:20:03.527662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5jq2d" event={"ID":"2fa36864-508b-488b-8830-d60337213cca","Type":"ContainerStarted","Data":"0599d62799c2a49eedf0207ad3e46f1506946c05f3c6a31f9884f6cdc782ba46"}
Dec 05 20:20:03 crc kubenswrapper[4885]: I1205 20:20:03.527713 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5jq2d" event={"ID":"2fa36864-508b-488b-8830-d60337213cca","Type":"ContainerStarted","Data":"3f78d57d2df4a48563f7ee59c65a4a05bfaa0ac9f959aee72cb5bf0f14dbbf83"}
Dec 05 20:20:03 crc kubenswrapper[4885]: I1205 20:20:03.551968 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5jq2d" podStartSLOduration=3.5519507089999998 podStartE2EDuration="3.551950709s" podCreationTimestamp="2025-12-05 20:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:20:03.545845484 +0000 UTC m=+868.842661155" watchObservedRunningTime="2025-12-05 20:20:03.551950709 +0000 UTC m=+868.848766390"
Dec 05 20:20:04 crc kubenswrapper[4885]: I1205 20:20:04.532400 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5jq2d"
Dec 05 20:20:09 crc kubenswrapper[4885]: I1205 20:20:09.577186 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0f8b2ce-10b2-491b-9100-34835c07e175" containerID="9959156b421104330dee7b4c221df091b4c09b53e1ded689aa7e1d0b03ffad87" exitCode=0
Dec 05 20:20:09 crc kubenswrapper[4885]: I1205 20:20:09.577892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerDied","Data":"9959156b421104330dee7b4c221df091b4c09b53e1ded689aa7e1d0b03ffad87"}
Dec 05 20:20:09 crc kubenswrapper[4885]: I1205 20:20:09.584153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" event={"ID":"a1b920f0-0596-43ef-b94b-d3035f0e5e1c","Type":"ContainerStarted","Data":"2f9cd2bd76f1b875fad6496f485d6e602f3d3dffdecf222beadbf52d8af60405"}
Dec 05 20:20:09 crc kubenswrapper[4885]: I1205 20:20:09.584304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq"
Dec 05 20:20:09 crc kubenswrapper[4885]: I1205 20:20:09.632649 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq" podStartSLOduration=2.775790711 podStartE2EDuration="9.6326278s" podCreationTimestamp="2025-12-05 20:20:00 +0000 UTC" firstStartedPulling="2025-12-05 20:20:01.539280691 +0000 UTC m=+866.836096352" lastFinishedPulling="2025-12-05 20:20:08.39611778 +0000 UTC m=+873.692933441" observedRunningTime="2025-12-05 20:20:09.627132714 +0000 UTC m=+874.923948375" watchObservedRunningTime="2025-12-05 20:20:09.6326278 +0000 UTC m=+874.929443461"
Dec 05 20:20:10 crc kubenswrapper[4885]: I1205 20:20:10.591072 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0f8b2ce-10b2-491b-9100-34835c07e175" containerID="97063d5c3b9809ac2668f51ce5b9f5014526d9d12c2fea6a8a77fa95b95c2a53" exitCode=0
Dec 05 20:20:10 crc kubenswrapper[4885]: I1205 20:20:10.591472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerDied","Data":"97063d5c3b9809ac2668f51ce5b9f5014526d9d12c2fea6a8a77fa95b95c2a53"}
Dec 05 20:20:11 crc kubenswrapper[4885]: I1205 20:20:11.602730 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0f8b2ce-10b2-491b-9100-34835c07e175" containerID="9dc4b476a0c6e5795a2a5c3ddbce82e0f88b077dcd562f2b1349ab9a66f02ea1" exitCode=0
Dec 05 20:20:11 crc kubenswrapper[4885]: I1205 20:20:11.602817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerDied","Data":"9dc4b476a0c6e5795a2a5c3ddbce82e0f88b077dcd562f2b1349ab9a66f02ea1"}
event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerDied","Data":"9dc4b476a0c6e5795a2a5c3ddbce82e0f88b077dcd562f2b1349ab9a66f02ea1"} Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.340385 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5jq2d" Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.639554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"302e2a3785476a06047a6b9e11c66294f11adfaf0bc0f052ae6c270f269f6a0c"} Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.639879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"d35d970801e531dadd0294aeee9b819d544558099fe14280d49fddf17b544845"} Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.639891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"486fbb6f107068136e3f8895e9e35574babb1f98dce380e3823962d4a2fce3aa"} Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.639903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"998aae23aebdab26452a832aa67b226085f61125b2bd093aa329a0a4103872bc"} Dec 05 20:20:12 crc kubenswrapper[4885]: I1205 20:20:12.639913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"3714e6be3a57d260c240b622ca78eb5c9a48b53b6ed9b1407308b69b8d16fd88"} Dec 05 20:20:13 crc kubenswrapper[4885]: I1205 20:20:13.651969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qkxh" event={"ID":"f0f8b2ce-10b2-491b-9100-34835c07e175","Type":"ContainerStarted","Data":"a0be609812818c56c3ea737b9edf73a6ab83df49f18f70523fccd82ac292ff21"} Dec 05 20:20:13 crc kubenswrapper[4885]: I1205 20:20:13.652164 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2qkxh" Dec 05 20:20:13 crc kubenswrapper[4885]: I1205 20:20:13.677264 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2qkxh" podStartSLOduration=6.695114537 podStartE2EDuration="13.677248875s" podCreationTimestamp="2025-12-05 20:20:00 +0000 UTC" firstStartedPulling="2025-12-05 20:20:01.415195771 +0000 UTC m=+866.712011432" lastFinishedPulling="2025-12-05 20:20:08.397330109 +0000 UTC m=+873.694145770" observedRunningTime="2025-12-05 20:20:13.673230696 +0000 UTC m=+878.970046377" watchObservedRunningTime="2025-12-05 20:20:13.677248875 +0000 UTC m=+878.974064536" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.436841 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-625x7"] Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.437977 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-625x7" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.441078 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.441728 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.447798 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cg2x9" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.449739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-625x7"] Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.539416 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jv2w\" (UniqueName: \"kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w\") pod \"openstack-operator-index-625x7\" (UID: \"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65\") " pod="openstack-operators/openstack-operator-index-625x7" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.640216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jv2w\" (UniqueName: \"kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w\") pod \"openstack-operator-index-625x7\" (UID: \"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65\") " pod="openstack-operators/openstack-operator-index-625x7" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.672098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jv2w\" (UniqueName: \"kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w\") pod \"openstack-operator-index-625x7\" (UID: \"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65\") " pod="openstack-operators/openstack-operator-index-625x7" Dec 05 20:20:15 crc kubenswrapper[4885]: I1205 20:20:15.754451 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:20:16 crc kubenswrapper[4885]: I1205 20:20:16.173904 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-625x7"]
Dec 05 20:20:16 crc kubenswrapper[4885]: W1205 20:20:16.177809 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a6f8bf_f3ef_4c6a_8906_4759feaffc65.slice/crio-ce109bd7e8f36b2a330858ca7e9fd6d2710f9973cd876477e586dcaa5fc1e3b4 WatchSource:0}: Error finding container ce109bd7e8f36b2a330858ca7e9fd6d2710f9973cd876477e586dcaa5fc1e3b4: Status 404 returned error can't find the container with id ce109bd7e8f36b2a330858ca7e9fd6d2710f9973cd876477e586dcaa5fc1e3b4
Dec 05 20:20:16 crc kubenswrapper[4885]: I1205 20:20:16.271849 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2qkxh"
Dec 05 20:20:16 crc kubenswrapper[4885]: I1205 20:20:16.308285 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2qkxh"
Dec 05 20:20:17 crc kubenswrapper[4885]: I1205 20:20:16.681495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-625x7" event={"ID":"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65","Type":"ContainerStarted","Data":"ce109bd7e8f36b2a330858ca7e9fd6d2710f9973cd876477e586dcaa5fc1e3b4"}
Dec 05 20:20:18 crc kubenswrapper[4885]: I1205 20:20:18.703980 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-625x7" event={"ID":"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65","Type":"ContainerStarted","Data":"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"}
Dec 05 20:20:18 crc kubenswrapper[4885]: I1205 20:20:18.732948 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-625x7" podStartSLOduration=1.806383901 podStartE2EDuration="3.732920173s" podCreationTimestamp="2025-12-05 20:20:15 +0000 UTC" firstStartedPulling="2025-12-05 20:20:16.181326318 +0000 UTC m=+881.478141979" lastFinishedPulling="2025-12-05 20:20:18.10786259 +0000 UTC m=+883.404678251" observedRunningTime="2025-12-05 20:20:18.724052979 +0000 UTC m=+884.020868680" watchObservedRunningTime="2025-12-05 20:20:18.732920173 +0000 UTC m=+884.029735874"
Dec 05 20:20:18 crc kubenswrapper[4885]: I1205 20:20:18.815428 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-625x7"]
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.420122 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jm7lc"]
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.421337 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.434726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jm7lc"]
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.489133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6trn7\" (UniqueName: \"kubernetes.io/projected/0f1ef804-3daa-44e0-a978-f6edc8efab00-kube-api-access-6trn7\") pod \"openstack-operator-index-jm7lc\" (UID: \"0f1ef804-3daa-44e0-a978-f6edc8efab00\") " pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.590926 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6trn7\" (UniqueName: \"kubernetes.io/projected/0f1ef804-3daa-44e0-a978-f6edc8efab00-kube-api-access-6trn7\") pod \"openstack-operator-index-jm7lc\" (UID: \"0f1ef804-3daa-44e0-a978-f6edc8efab00\") " pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.615635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6trn7\" (UniqueName: \"kubernetes.io/projected/0f1ef804-3daa-44e0-a978-f6edc8efab00-kube-api-access-6trn7\") pod \"openstack-operator-index-jm7lc\" (UID: \"0f1ef804-3daa-44e0-a978-f6edc8efab00\") " pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.744567 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:19 crc kubenswrapper[4885]: I1205 20:20:19.970321 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jm7lc"]
Dec 05 20:20:20 crc kubenswrapper[4885]: I1205 20:20:20.719828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jm7lc" event={"ID":"0f1ef804-3daa-44e0-a978-f6edc8efab00","Type":"ContainerStarted","Data":"ebf868442331b5a06a1fb977c5f69c03528ed42b451afa804d5318d356a61b78"}
Dec 05 20:20:20 crc kubenswrapper[4885]: I1205 20:20:20.719862 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-625x7" podUID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" containerName="registry-server" containerID="cri-o://29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a" gracePeriod=2
Dec 05 20:20:20 crc kubenswrapper[4885]: I1205 20:20:20.719907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jm7lc" event={"ID":"0f1ef804-3daa-44e0-a978-f6edc8efab00","Type":"ContainerStarted","Data":"e70dbec76fc16715a87bd99c2aa3fcdac5ba375552b7851f5c7d23f43ca7a14b"}
Dec 05 20:20:20 crc kubenswrapper[4885]: I1205 20:20:20.746658 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jm7lc" podStartSLOduration=1.291602783 podStartE2EDuration="1.746632664s" podCreationTimestamp="2025-12-05 20:20:19 +0000 UTC" firstStartedPulling="2025-12-05 20:20:19.978661639 +0000 UTC m=+885.275477300" lastFinishedPulling="2025-12-05 20:20:20.43369148 +0000 UTC m=+885.730507181" observedRunningTime="2025-12-05 20:20:20.740772587 +0000 UTC m=+886.037588288" watchObservedRunningTime="2025-12-05 20:20:20.746632664 +0000 UTC m=+886.043448325"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.093698 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-625x7"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.215337 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jv2w\" (UniqueName: \"kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w\") pod \"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65\" (UID: \"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65\") "
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.219988 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w" (OuterVolumeSpecName: "kube-api-access-7jv2w") pod "b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" (UID: "b1a6f8bf-f3ef-4c6a-8906-4759feaffc65"). InnerVolumeSpecName "kube-api-access-7jv2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.275402 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2qkxh"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.283313 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p9slq"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.316369 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jv2w\" (UniqueName: \"kubernetes.io/projected/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65-kube-api-access-7jv2w\") on node \"crc\" DevicePath \"\""
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.468850 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-gwwj5"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.729524 4885 generic.go:334] "Generic (PLEG): container finished" podID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" containerID="29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a" exitCode=0
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.729593 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-625x7"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.729611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-625x7" event={"ID":"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65","Type":"ContainerDied","Data":"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"}
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.729646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-625x7" event={"ID":"b1a6f8bf-f3ef-4c6a-8906-4759feaffc65","Type":"ContainerDied","Data":"ce109bd7e8f36b2a330858ca7e9fd6d2710f9973cd876477e586dcaa5fc1e3b4"}
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.729664 4885 scope.go:117] "RemoveContainer" containerID="29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.756354 4885 scope.go:117] "RemoveContainer" containerID="29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"
Dec 05 20:20:21 crc kubenswrapper[4885]: E1205 20:20:21.758573 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a\": container with ID starting with 29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a not found: ID does not exist" containerID="29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.758654 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a"} err="failed to get container status \"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a\": rpc error: code = NotFound desc = could not find container \"29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a\": container with ID starting with 29157b8b781b25697b77d7abcdcdcf852fb861e1dd6b2444b5f984e8b46c3e6a not found: ID does not exist"
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.763353 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-625x7"]
Dec 05 20:20:21 crc kubenswrapper[4885]: I1205 20:20:21.767185 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-625x7"]
Dec 05 20:20:23 crc kubenswrapper[4885]: I1205 20:20:23.183310 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" path="/var/lib/kubelet/pods/b1a6f8bf-f3ef-4c6a-8906-4759feaffc65/volumes"
Dec 05 20:20:29 crc kubenswrapper[4885]: I1205 20:20:29.744868 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:29 crc kubenswrapper[4885]: I1205 20:20:29.745476 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:29 crc kubenswrapper[4885]: I1205 20:20:29.789044 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:30 crc kubenswrapper[4885]: I1205 20:20:30.847088 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jm7lc"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.254619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"]
Dec 05 20:20:32 crc kubenswrapper[4885]: E1205 20:20:32.254842 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" containerName="registry-server"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.254853 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" containerName="registry-server"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.254963 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a6f8bf-f3ef-4c6a-8906-4759feaffc65" containerName="registry-server"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.255791 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.257699 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kv6rf"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.270935 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"]
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.362212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgs45\" (UniqueName: \"kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.362407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.362585 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.463199 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgs45\" (UniqueName: \"kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.463266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.463312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.463813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.464053 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.491416 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgs45\" (UniqueName: \"kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:32 crc kubenswrapper[4885]: I1205 20:20:32.587494 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:33 crc kubenswrapper[4885]: I1205 20:20:33.077932 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"]
Dec 05 20:20:33 crc kubenswrapper[4885]: I1205 20:20:33.838129 4885 generic.go:334] "Generic (PLEG): container finished" podID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerID="f7e2b557851170767d751007f877d51fecaec82a4cd84b690e440a498f62af8a" exitCode=0
Dec 05 20:20:33 crc kubenswrapper[4885]: I1205 20:20:33.838193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c" event={"ID":"9339b513-f7aa-4ad6-9e87-b585e81c0577","Type":"ContainerDied","Data":"f7e2b557851170767d751007f877d51fecaec82a4cd84b690e440a498f62af8a"}
Dec 05 20:20:33 crc kubenswrapper[4885]: I1205 20:20:33.838234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c" event={"ID":"9339b513-f7aa-4ad6-9e87-b585e81c0577","Type":"ContainerStarted","Data":"f049e0dc8f5b3e0dba24c23bcf546dbc745f297494f9c366dff6bff6a2466857"}
Dec 05 20:20:34 crc kubenswrapper[4885]: I1205 20:20:34.848121 4885 generic.go:334] "Generic (PLEG): container finished" podID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerID="e6bf98bc108d3d6ba86e82e824cb29d2118663a69fc1f01ba97df92f39e4589a" exitCode=0
Dec 05 20:20:34 crc kubenswrapper[4885]: I1205 20:20:34.848294 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c" event={"ID":"9339b513-f7aa-4ad6-9e87-b585e81c0577","Type":"ContainerDied","Data":"e6bf98bc108d3d6ba86e82e824cb29d2118663a69fc1f01ba97df92f39e4589a"}
Dec 05 20:20:35 crc kubenswrapper[4885]: I1205 20:20:35.860342 4885 generic.go:334] "Generic (PLEG): container finished" podID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerID="5d07b94edb58ce725007fe980417668a55ab9fdd8d8c59d975195d6f5ee6cac4" exitCode=0
Dec 05 20:20:35 crc kubenswrapper[4885]: I1205 20:20:35.860705 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c" event={"ID":"9339b513-f7aa-4ad6-9e87-b585e81c0577","Type":"ContainerDied","Data":"5d07b94edb58ce725007fe980417668a55ab9fdd8d8c59d975195d6f5ee6cac4"}
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.028814 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmr26"]
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.030500 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmr26"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.053633 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmr26"]
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.151679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhtg\" (UniqueName: \"kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.151738 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.151819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.153045 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.252991 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgs45\" (UniqueName: \"kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45\") pod \"9339b513-f7aa-4ad6-9e87-b585e81c0577\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") "
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.253090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util\") pod \"9339b513-f7aa-4ad6-9e87-b585e81c0577\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") "
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.253175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle\") pod \"9339b513-f7aa-4ad6-9e87-b585e81c0577\" (UID: \"9339b513-f7aa-4ad6-9e87-b585e81c0577\") "
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.253567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhtg\" (UniqueName: \"kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26"
Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.253599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26"
pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.253659 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.254109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.254475 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle" (OuterVolumeSpecName: "bundle") pod "9339b513-f7aa-4ad6-9e87-b585e81c0577" (UID: "9339b513-f7aa-4ad6-9e87-b585e81c0577"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.254699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.259477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45" (OuterVolumeSpecName: "kube-api-access-qgs45") pod "9339b513-f7aa-4ad6-9e87-b585e81c0577" (UID: "9339b513-f7aa-4ad6-9e87-b585e81c0577"). InnerVolumeSpecName "kube-api-access-qgs45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.266350 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util" (OuterVolumeSpecName: "util") pod "9339b513-f7aa-4ad6-9e87-b585e81c0577" (UID: "9339b513-f7aa-4ad6-9e87-b585e81c0577"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.272636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhtg\" (UniqueName: \"kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg\") pod \"certified-operators-bmr26\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.354649 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.354692 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgs45\" (UniqueName: \"kubernetes.io/projected/9339b513-f7aa-4ad6-9e87-b585e81c0577-kube-api-access-qgs45\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.354706 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9339b513-f7aa-4ad6-9e87-b585e81c0577-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.373601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:37 crc kubenswrapper[4885]: W1205 20:20:37.813933 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a21534_bc6c_49aa_bdfb_4fea90123708.slice/crio-e637e9331134b087f772eaf437d266a272bcdbac7329e492e9744bb7c42ccc27 WatchSource:0}: Error finding container e637e9331134b087f772eaf437d266a272bcdbac7329e492e9744bb7c42ccc27: Status 404 returned error can't find the container with id e637e9331134b087f772eaf437d266a272bcdbac7329e492e9744bb7c42ccc27 Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.817073 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmr26"] Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.872974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerStarted","Data":"e637e9331134b087f772eaf437d266a272bcdbac7329e492e9744bb7c42ccc27"} Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.875230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c" event={"ID":"9339b513-f7aa-4ad6-9e87-b585e81c0577","Type":"ContainerDied","Data":"f049e0dc8f5b3e0dba24c23bcf546dbc745f297494f9c366dff6bff6a2466857"} Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.875262 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f049e0dc8f5b3e0dba24c23bcf546dbc745f297494f9c366dff6bff6a2466857" Dec 05 20:20:37 crc kubenswrapper[4885]: I1205 20:20:37.875344 4885 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:20:38 crc kubenswrapper[4885]: I1205 20:20:38.884891 4885 generic.go:334] "Generic (PLEG): container finished" podID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerID="b3140dc280491fb36709eb05424f1ddc07154ef39c4f49efbbcec371d2dd3cc6" exitCode=0
Dec 05 20:20:38 crc kubenswrapper[4885]: I1205 20:20:38.884943 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerDied","Data":"b3140dc280491fb36709eb05424f1ddc07154ef39c4f49efbbcec371d2dd3cc6"}
Dec 05 20:20:39 crc kubenswrapper[4885]: I1205 20:20:39.894351 4885 generic.go:334] "Generic (PLEG): container finished" podID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerID="8189c0f00c2a7bfc6c6c05c72220f32ec69ae65f5c20b831cf22b3f5a22b6cfb" exitCode=0
Dec 05 20:20:39 crc kubenswrapper[4885]: I1205 20:20:39.894553 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerDied","Data":"8189c0f00c2a7bfc6c6c05c72220f32ec69ae65f5c20b831cf22b3f5a22b6cfb"}
Dec 05 20:20:40 crc kubenswrapper[4885]: I1205 20:20:40.902358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerStarted","Data":"32359760a23136b4ecb6851fb81805b1cbcd33eca49cc924c01a5c2974d945e1"}
Dec 05 20:20:40 crc kubenswrapper[4885]: I1205 20:20:40.925765 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmr26" podStartSLOduration=2.528591455 podStartE2EDuration="3.925744946s" podCreationTimestamp="2025-12-05 20:20:37 +0000 UTC" firstStartedPulling="2025-12-05 20:20:38.888532463 +0000 UTC m=+904.185348164" lastFinishedPulling="2025-12-05 20:20:40.285685994 +0000 UTC m=+905.582501655" observedRunningTime="2025-12-05 20:20:40.920595822 +0000 UTC m=+906.217411493" watchObservedRunningTime="2025-12-05 20:20:40.925744946 +0000 UTC m=+906.222560627"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.293580 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"]
Dec 05 20:20:41 crc kubenswrapper[4885]: E1205 20:20:41.294166 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="extract"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.294257 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="extract"
Dec 05 20:20:41 crc kubenswrapper[4885]: E1205 20:20:41.294350 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="util"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.294425 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="util"
Dec 05 20:20:41 crc kubenswrapper[4885]: E1205 20:20:41.294502 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="pull"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.294568 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="pull"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.294789 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9339b513-f7aa-4ad6-9e87-b585e81c0577" containerName="extract"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.295421 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.298207 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-6564w"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.338964 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"]
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.422160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgcxp\" (UniqueName: \"kubernetes.io/projected/15ce450d-0098-4b25-afd2-5bda05cfb5b0-kube-api-access-xgcxp\") pod \"openstack-operator-controller-operator-55b6fb9447-qk2s7\" (UID: \"15ce450d-0098-4b25-afd2-5bda05cfb5b0\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.524088 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgcxp\" (UniqueName: \"kubernetes.io/projected/15ce450d-0098-4b25-afd2-5bda05cfb5b0-kube-api-access-xgcxp\") pod \"openstack-operator-controller-operator-55b6fb9447-qk2s7\" (UID: \"15ce450d-0098-4b25-afd2-5bda05cfb5b0\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.573045 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgcxp\" (UniqueName: \"kubernetes.io/projected/15ce450d-0098-4b25-afd2-5bda05cfb5b0-kube-api-access-xgcxp\") pod \"openstack-operator-controller-operator-55b6fb9447-qk2s7\" (UID: \"15ce450d-0098-4b25-afd2-5bda05cfb5b0\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"
Dec 05 20:20:41 crc kubenswrapper[4885]: I1205 20:20:41.611168 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" Dec 05 20:20:42 crc kubenswrapper[4885]: I1205 20:20:42.054742 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7"] Dec 05 20:20:42 crc kubenswrapper[4885]: W1205 20:20:42.064302 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ce450d_0098_4b25_afd2_5bda05cfb5b0.slice/crio-faf3caafa2023bac3d6c4edd47086a4f9cc23f8a096a0109292cd2e4dd0aad28 WatchSource:0}: Error finding container faf3caafa2023bac3d6c4edd47086a4f9cc23f8a096a0109292cd2e4dd0aad28: Status 404 returned error can't find the container with id faf3caafa2023bac3d6c4edd47086a4f9cc23f8a096a0109292cd2e4dd0aad28 Dec 05 20:20:42 crc kubenswrapper[4885]: I1205 20:20:42.924190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" event={"ID":"15ce450d-0098-4b25-afd2-5bda05cfb5b0","Type":"ContainerStarted","Data":"faf3caafa2023bac3d6c4edd47086a4f9cc23f8a096a0109292cd2e4dd0aad28"} Dec 05 20:20:46 crc kubenswrapper[4885]: I1205 20:20:46.630849 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:20:46 crc kubenswrapper[4885]: I1205 20:20:46.631207 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:20:46 crc kubenswrapper[4885]: I1205 20:20:46.970472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" event={"ID":"15ce450d-0098-4b25-afd2-5bda05cfb5b0","Type":"ContainerStarted","Data":"d247281a84a3b398fdb47bdd433de063d8786507064e80d974aa9f8a23f31b9a"} Dec 05 20:20:46 crc kubenswrapper[4885]: I1205 20:20:46.970649 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" Dec 05 20:20:47 crc kubenswrapper[4885]: I1205 20:20:47.006155 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" podStartSLOduration=1.7279484649999999 podStartE2EDuration="6.006128034s" podCreationTimestamp="2025-12-05 20:20:41 +0000 UTC" firstStartedPulling="2025-12-05 20:20:42.065591623 +0000 UTC m=+907.362407304" lastFinishedPulling="2025-12-05 20:20:46.343771212 +0000 UTC m=+911.640586873" observedRunningTime="2025-12-05 20:20:46.998227482 +0000 UTC m=+912.295043143" watchObservedRunningTime="2025-12-05 20:20:47.006128034 +0000 UTC m=+912.302943695" Dec 05 20:20:47 crc kubenswrapper[4885]: I1205 20:20:47.373806 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:47 crc kubenswrapper[4885]: I1205 20:20:47.373902 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:47 crc kubenswrapper[4885]: I1205 20:20:47.435048 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:48 crc kubenswrapper[4885]: I1205 20:20:48.023663 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:49 crc kubenswrapper[4885]: I1205 20:20:49.817273 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmr26"] Dec 05 20:20:49 crc kubenswrapper[4885]: I1205 20:20:49.988338 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmr26" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="registry-server" containerID="cri-o://32359760a23136b4ecb6851fb81805b1cbcd33eca49cc924c01a5c2974d945e1" gracePeriod=2 Dec 05 20:20:50 crc kubenswrapper[4885]: I1205 20:20:50.995275 4885 generic.go:334] "Generic (PLEG): container finished" podID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerID="32359760a23136b4ecb6851fb81805b1cbcd33eca49cc924c01a5c2974d945e1" exitCode=0 Dec 05 20:20:50 crc kubenswrapper[4885]: I1205 20:20:50.995316 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerDied","Data":"32359760a23136b4ecb6851fb81805b1cbcd33eca49cc924c01a5c2974d945e1"} Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.458198 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.566396 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities\") pod \"44a21534-bc6c-49aa-bdfb-4fea90123708\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.566551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content\") pod \"44a21534-bc6c-49aa-bdfb-4fea90123708\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.566604 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zhtg\" (UniqueName: \"kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg\") pod \"44a21534-bc6c-49aa-bdfb-4fea90123708\" (UID: \"44a21534-bc6c-49aa-bdfb-4fea90123708\") " Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.568051 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities" (OuterVolumeSpecName: "utilities") pod "44a21534-bc6c-49aa-bdfb-4fea90123708" (UID: "44a21534-bc6c-49aa-bdfb-4fea90123708"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.576782 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg" (OuterVolumeSpecName: "kube-api-access-6zhtg") pod "44a21534-bc6c-49aa-bdfb-4fea90123708" (UID: "44a21534-bc6c-49aa-bdfb-4fea90123708"). InnerVolumeSpecName "kube-api-access-6zhtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.616720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qk2s7" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.617504 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44a21534-bc6c-49aa-bdfb-4fea90123708" (UID: "44a21534-bc6c-49aa-bdfb-4fea90123708"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.667770 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.667800 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zhtg\" (UniqueName: \"kubernetes.io/projected/44a21534-bc6c-49aa-bdfb-4fea90123708-kube-api-access-6zhtg\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:51 crc kubenswrapper[4885]: I1205 20:20:51.667811 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a21534-bc6c-49aa-bdfb-4fea90123708-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.003062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmr26" event={"ID":"44a21534-bc6c-49aa-bdfb-4fea90123708","Type":"ContainerDied","Data":"e637e9331134b087f772eaf437d266a272bcdbac7329e492e9744bb7c42ccc27"} Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.003111 4885 scope.go:117] "RemoveContainer" containerID="32359760a23136b4ecb6851fb81805b1cbcd33eca49cc924c01a5c2974d945e1" Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.003131 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmr26" Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.018685 4885 scope.go:117] "RemoveContainer" containerID="8189c0f00c2a7bfc6c6c05c72220f32ec69ae65f5c20b831cf22b3f5a22b6cfb" Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.036085 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmr26"] Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.039357 4885 scope.go:117] "RemoveContainer" containerID="b3140dc280491fb36709eb05424f1ddc07154ef39c4f49efbbcec371d2dd3cc6" Dec 05 20:20:52 crc kubenswrapper[4885]: I1205 20:20:52.039987 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bmr26"] Dec 05 20:20:53 crc kubenswrapper[4885]: I1205 20:20:53.178854 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" path="/var/lib/kubelet/pods/44a21534-bc6c-49aa-bdfb-4fea90123708/volumes" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.948918 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46"] Dec 05 20:21:09 crc kubenswrapper[4885]: E1205 20:21:09.949692 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="registry-server" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.949706 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="registry-server" Dec 05 20:21:09 crc kubenswrapper[4885]: E1205 20:21:09.949726 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="extract-utilities" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.949734 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="extract-utilities" Dec 05 20:21:09 crc kubenswrapper[4885]: E1205 20:21:09.949745 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="extract-content" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.949754 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="extract-content" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.949874 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a21534-bc6c-49aa-bdfb-4fea90123708" containerName="registry-server" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.950616 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.959402 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46"] Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.959976 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2v5bs" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.966734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd"] Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.967855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.969921 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nsdjh" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.980633 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj"] Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.981680 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/74869c39-a4c4-4812-8656-4751d25ef987-kube-api-access-cmpmq\") pod \"barbican-operator-controller-manager-7d9dfd778-cqj46\" (UID: \"74869c39-a4c4-4812-8656-4751d25ef987\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.981805 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:09 crc kubenswrapper[4885]: I1205 20:21:09.985348 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-h6z2b" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.003733 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.009088 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.013079 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.014112 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.016503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-5xw2c" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.023115 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.024107 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.026043 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-b7n5p" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.027823 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.036587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.052611 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.054313 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.062193 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.062453 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.062705 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xwwz4" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.063290 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.068956 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-lcrmt" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.076194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082216 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjcv\" (UniqueName: \"kubernetes.io/projected/9034e951-dbbb-4927-b9fa-fa2e83c1595c-kube-api-access-vxjcv\") pod \"heat-operator-controller-manager-5f64f6f8bb-kgdg2\" (UID: \"9034e951-dbbb-4927-b9fa-fa2e83c1595c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082652 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrmc\" (UniqueName: \"kubernetes.io/projected/c942221f-6ad2-4109-9975-ec8054686283-kube-api-access-dkrmc\") pod \"glance-operator-controller-manager-77987cd8cd-rqh2l\" (UID: \"c942221f-6ad2-4109-9975-ec8054686283\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082692 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/74869c39-a4c4-4812-8656-4751d25ef987-kube-api-access-cmpmq\") pod \"barbican-operator-controller-manager-7d9dfd778-cqj46\" (UID: \"74869c39-a4c4-4812-8656-4751d25ef987\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mblcc\" (UniqueName: \"kubernetes.io/projected/6a0f526a-c496-478e-bc4c-e6478ebeb3ea-kube-api-access-mblcc\") pod \"designate-operator-controller-manager-78b4bc895b-nqshj\" (UID: \"6a0f526a-c496-478e-bc4c-e6478ebeb3ea\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.082768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bsr\" (UniqueName: \"kubernetes.io/projected/93741f1b-6823-4374-927f-38d95ba139f5-kube-api-access-72bsr\") pod \"cinder-operator-controller-manager-859b6ccc6-s4ftd\" (UID: \"93741f1b-6823-4374-927f-38d95ba139f5\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.106189 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.107162 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.112333 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.112369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-z58hp" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.126728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.139722 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7pw48" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.151453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/74869c39-a4c4-4812-8656-4751d25ef987-kube-api-access-cmpmq\") pod \"barbican-operator-controller-manager-7d9dfd778-cqj46\" (UID: \"74869c39-a4c4-4812-8656-4751d25ef987\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.202088 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.205411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzg75\" (UniqueName: \"kubernetes.io/projected/ee66e99c-4761-43a5-a55c-b28957859913-kube-api-access-pzg75\") pod \"horizon-operator-controller-manager-68c6d99b8f-zz7df\" (UID: \"ee66e99c-4761-43a5-a55c-b28957859913\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.205462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bsr\" (UniqueName: \"kubernetes.io/projected/93741f1b-6823-4374-927f-38d95ba139f5-kube-api-access-72bsr\") pod \"cinder-operator-controller-manager-859b6ccc6-s4ftd\" (UID: \"93741f1b-6823-4374-927f-38d95ba139f5\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.205499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjcv\" (UniqueName: \"kubernetes.io/projected/9034e951-dbbb-4927-b9fa-fa2e83c1595c-kube-api-access-vxjcv\") pod \"heat-operator-controller-manager-5f64f6f8bb-kgdg2\" (UID: \"9034e951-dbbb-4927-b9fa-fa2e83c1595c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.206309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrmc\" (UniqueName: \"kubernetes.io/projected/c942221f-6ad2-4109-9975-ec8054686283-kube-api-access-dkrmc\") pod \"glance-operator-controller-manager-77987cd8cd-rqh2l\" (UID: \"c942221f-6ad2-4109-9975-ec8054686283\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.206363 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxjw\" (UniqueName: \"kubernetes.io/projected/f9775930-6d69-4ad4-a249-f5d2f270b365-kube-api-access-wgxjw\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.206408 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mblcc\" (UniqueName: \"kubernetes.io/projected/6a0f526a-c496-478e-bc4c-e6478ebeb3ea-kube-api-access-mblcc\") pod \"designate-operator-controller-manager-78b4bc895b-nqshj\" (UID: \"6a0f526a-c496-478e-bc4c-e6478ebeb3ea\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.206443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dgcl\" (UniqueName: \"kubernetes.io/projected/da47cf7f-37ab-4d5d-99b1-1b312002f83e-kube-api-access-7dgcl\") pod \"keystone-operator-controller-manager-7765d96ddf-r6ljq\" (UID: \"da47cf7f-37ab-4d5d-99b1-1b312002f83e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.206498 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.218086 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.256586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72bsr\" (UniqueName: \"kubernetes.io/projected/93741f1b-6823-4374-927f-38d95ba139f5-kube-api-access-72bsr\") pod \"cinder-operator-controller-manager-859b6ccc6-s4ftd\" (UID: \"93741f1b-6823-4374-927f-38d95ba139f5\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.259798 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.260754 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.261675 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mblcc\" (UniqueName: \"kubernetes.io/projected/6a0f526a-c496-478e-bc4c-e6478ebeb3ea-kube-api-access-mblcc\") pod \"designate-operator-controller-manager-78b4bc895b-nqshj\" (UID: \"6a0f526a-c496-478e-bc4c-e6478ebeb3ea\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.266650 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrmc\" (UniqueName: \"kubernetes.io/projected/c942221f-6ad2-4109-9975-ec8054686283-kube-api-access-dkrmc\") pod \"glance-operator-controller-manager-77987cd8cd-rqh2l\" (UID: \"c942221f-6ad2-4109-9975-ec8054686283\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.266914 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.268295 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9mcgf" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.269401 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.272891 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.285170 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-v4cwf" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.287474 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.292662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjcv\" (UniqueName: \"kubernetes.io/projected/9034e951-dbbb-4927-b9fa-fa2e83c1595c-kube-api-access-vxjcv\") pod \"heat-operator-controller-manager-5f64f6f8bb-kgdg2\" (UID: \"9034e951-dbbb-4927-b9fa-fa2e83c1595c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.304012 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.308704 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.308766 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzg75\" (UniqueName: \"kubernetes.io/projected/ee66e99c-4761-43a5-a55c-b28957859913-kube-api-access-pzg75\") pod \"horizon-operator-controller-manager-68c6d99b8f-zz7df\" (UID: \"ee66e99c-4761-43a5-a55c-b28957859913\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.308821 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b59r\" (UniqueName: \"kubernetes.io/projected/741c1713-f931-471e-ad95-99d16600ab76-kube-api-access-6b59r\") pod \"ironic-operator-controller-manager-6c548fd776-z27c2\" (UID: \"741c1713-f931-471e-ad95-99d16600ab76\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.308864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxjw\" (UniqueName: \"kubernetes.io/projected/f9775930-6d69-4ad4-a249-f5d2f270b365-kube-api-access-wgxjw\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.308910 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dgcl\" (UniqueName: \"kubernetes.io/projected/da47cf7f-37ab-4d5d-99b1-1b312002f83e-kube-api-access-7dgcl\") pod \"keystone-operator-controller-manager-7765d96ddf-r6ljq\" (UID: \"da47cf7f-37ab-4d5d-99b1-1b312002f83e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" Dec 05 20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.313050 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.313244 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert podName:f9775930-6d69-4ad4-a249-f5d2f270b365 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:10.813128633 +0000 UTC m=+936.109944294 (durationBeforeRetry 500ms). 
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.320874 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.322715 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.342905 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hwpd9"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.344534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.347526 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.347549 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.348795 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.349800 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.361701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dgcl\" (UniqueName: \"kubernetes.io/projected/da47cf7f-37ab-4d5d-99b1-1b312002f83e-kube-api-access-7dgcl\") pod \"keystone-operator-controller-manager-7765d96ddf-r6ljq\" (UID: \"da47cf7f-37ab-4d5d-99b1-1b312002f83e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.364960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxjw\" (UniqueName: \"kubernetes.io/projected/f9775930-6d69-4ad4-a249-f5d2f270b365-kube-api-access-wgxjw\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.366585 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzg75\" (UniqueName: \"kubernetes.io/projected/ee66e99c-4761-43a5-a55c-b28957859913-kube-api-access-pzg75\") pod \"horizon-operator-controller-manager-68c6d99b8f-zz7df\" (UID: \"ee66e99c-4761-43a5-a55c-b28957859913\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.366991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f6crw"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.384795 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.395873 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.413966 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79g4\" (UniqueName: \"kubernetes.io/projected/3e2eaf31-e16e-4072-ae6b-a5c9eda46732-kube-api-access-l79g4\") pod \"nova-operator-controller-manager-697bc559fc-w5c5m\" (UID: \"3e2eaf31-e16e-4072-ae6b-a5c9eda46732\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.414037 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxpv\" (UniqueName: \"kubernetes.io/projected/e12a10c6-f52c-4348-bb54-356af7632dd4-kube-api-access-8dxpv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hkw2j\" (UID: \"e12a10c6-f52c-4348-bb54-356af7632dd4\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.414055 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5x98\" (UniqueName: \"kubernetes.io/projected/ca2be922-afb3-4640-bdad-cfd3b0164d52-kube-api-access-t5x98\") pod \"manila-operator-controller-manager-7c79b5df47-4vb99\" (UID: \"ca2be922-afb3-4640-bdad-cfd3b0164d52\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.414207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b59r\" (UniqueName: \"kubernetes.io/projected/741c1713-f931-471e-ad95-99d16600ab76-kube-api-access-6b59r\") pod \"ironic-operator-controller-manager-6c548fd776-z27c2\" (UID: \"741c1713-f931-471e-ad95-99d16600ab76\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.414294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc8jl\" (UniqueName: \"kubernetes.io/projected/33f07e6f-9ac8-461d-b455-ad634c2e255c-kube-api-access-qc8jl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-z4wtk\" (UID: \"33f07e6f-9ac8-461d-b455-ad634c2e255c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.433091 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.454007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b59r\" (UniqueName: \"kubernetes.io/projected/741c1713-f931-471e-ad95-99d16600ab76-kube-api-access-6b59r\") pod \"ironic-operator-controller-manager-6c548fd776-z27c2\" (UID: \"741c1713-f931-471e-ad95-99d16600ab76\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.477123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.507034 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.510379 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.512412 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-sw475"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.512794 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.514948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l79g4\" (UniqueName: \"kubernetes.io/projected/3e2eaf31-e16e-4072-ae6b-a5c9eda46732-kube-api-access-l79g4\") pod \"nova-operator-controller-manager-697bc559fc-w5c5m\" (UID: \"3e2eaf31-e16e-4072-ae6b-a5c9eda46732\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.514972 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjnhm\" (UniqueName: \"kubernetes.io/projected/aed37ead-6406-43f0-a6f5-4e8864935a58-kube-api-access-vjnhm\") pod \"octavia-operator-controller-manager-998648c74-gwtxz\" (UID: \"aed37ead-6406-43f0-a6f5-4e8864935a58\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.515003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxpv\" (UniqueName: \"kubernetes.io/projected/e12a10c6-f52c-4348-bb54-356af7632dd4-kube-api-access-8dxpv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hkw2j\" (UID: \"e12a10c6-f52c-4348-bb54-356af7632dd4\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.515034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5x98\" (UniqueName: \"kubernetes.io/projected/ca2be922-afb3-4640-bdad-cfd3b0164d52-kube-api-access-t5x98\") pod \"manila-operator-controller-manager-7c79b5df47-4vb99\" (UID: \"ca2be922-afb3-4640-bdad-cfd3b0164d52\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.515080 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc8jl\" (UniqueName: \"kubernetes.io/projected/33f07e6f-9ac8-461d-b455-ad634c2e255c-kube-api-access-qc8jl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-z4wtk\" (UID: \"33f07e6f-9ac8-461d-b455-ad634c2e255c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.531524 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.551080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.551880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79g4\" (UniqueName: \"kubernetes.io/projected/3e2eaf31-e16e-4072-ae6b-a5c9eda46732-kube-api-access-l79g4\") pod \"nova-operator-controller-manager-697bc559fc-w5c5m\" (UID: \"3e2eaf31-e16e-4072-ae6b-a5c9eda46732\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.557162 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc8jl\" (UniqueName: \"kubernetes.io/projected/33f07e6f-9ac8-461d-b455-ad634c2e255c-kube-api-access-qc8jl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-z4wtk\" (UID: \"33f07e6f-9ac8-461d-b455-ad634c2e255c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.562814 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxpv\" (UniqueName: \"kubernetes.io/projected/e12a10c6-f52c-4348-bb54-356af7632dd4-kube-api-access-8dxpv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hkw2j\" (UID: \"e12a10c6-f52c-4348-bb54-356af7632dd4\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.571451 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.572704 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.584552 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-d6skg"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.585143 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd"]
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.586096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.590804 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sfp94"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.594055 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5x98\" (UniqueName: \"kubernetes.io/projected/ca2be922-afb3-4640-bdad-cfd3b0164d52-kube-api-access-t5x98\") pod \"manila-operator-controller-manager-7c79b5df47-4vb99\" (UID: \"ca2be922-afb3-4640-bdad-cfd3b0164d52\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"
Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.602618 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.619642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjnhm\" (UniqueName: \"kubernetes.io/projected/aed37ead-6406-43f0-a6f5-4e8864935a58-kube-api-access-vjnhm\") pod \"octavia-operator-controller-manager-998648c74-gwtxz\" (UID: \"aed37ead-6406-43f0-a6f5-4e8864935a58\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.634862 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.643128 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.649696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjnhm\" (UniqueName: \"kubernetes.io/projected/aed37ead-6406-43f0-a6f5-4e8864935a58-kube-api-access-vjnhm\") pod \"octavia-operator-controller-manager-998648c74-gwtxz\" (UID: \"aed37ead-6406-43f0-a6f5-4e8864935a58\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.662817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.689737 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.689879 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.692284 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.694573 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jv2ts" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.696800 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.722223 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6z6\" (UniqueName: \"kubernetes.io/projected/2eea8037-d11c-47ee-9bc9-67deafc20268-kube-api-access-2g6z6\") pod \"placement-operator-controller-manager-78f8948974-4q2vd\" (UID: \"2eea8037-d11c-47ee-9bc9-67deafc20268\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.722369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl5ln\" (UniqueName: \"kubernetes.io/projected/06e1a4eb-c6cb-4146-b2f9-484c2e699a7e-kube-api-access-kl5ln\") pod \"ovn-operator-controller-manager-b6456fdb6-t4mch\" (UID: \"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.723155 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.724159 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.729963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.731991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-stdlz" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.740335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.756436 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.757657 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.759772 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-z5hfj" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.774893 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.782689 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.795793 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-565xh"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.796899 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.802903 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bgqfz" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.808092 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-565xh"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.820001 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.826694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6z6\" (UniqueName: \"kubernetes.io/projected/2eea8037-d11c-47ee-9bc9-67deafc20268-kube-api-access-2g6z6\") pod \"placement-operator-controller-manager-78f8948974-4q2vd\" (UID: \"2eea8037-d11c-47ee-9bc9-67deafc20268\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.826840 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.826918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5xv\" (UniqueName: \"kubernetes.io/projected/c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5-kube-api-access-cf5xv\") pod \"swift-operator-controller-manager-5f8c65bbfc-t4xtt\" (UID: \"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.826954 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl5ln\" (UniqueName: \"kubernetes.io/projected/06e1a4eb-c6cb-4146-b2f9-484c2e699a7e-kube-api-access-kl5ln\") pod \"ovn-operator-controller-manager-b6456fdb6-t4mch\" (UID: \"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 
20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.827014 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.827069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p275\" (UniqueName: \"kubernetes.io/projected/fdb3c987-9d79-4920-9b95-1be3a3dbc622-kube-api-access-4p275\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.828540 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.828610 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert podName:f9775930-6d69-4ad4-a249-f5d2f270b365 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:11.828583096 +0000 UTC m=+937.125398757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert") pod "infra-operator-controller-manager-57548d458d-dpqcg" (UID: "f9775930-6d69-4ad4-a249-f5d2f270b365") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.854330 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl5ln\" (UniqueName: \"kubernetes.io/projected/06e1a4eb-c6cb-4146-b2f9-484c2e699a7e-kube-api-access-kl5ln\") pod \"ovn-operator-controller-manager-b6456fdb6-t4mch\" (UID: \"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.858154 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.862671 4885 util.go:30] "No sandbox for pod can be found. 
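The repeated secret "infra-operator-webhook-server-cert" not found errors are typically transient during an operator rollout: the pod referencing the cert volume is created before whatever component issues the webhook certificate has written the Secret, and the kubelet simply keeps retrying the mount until it appears. A minimal client-go sketch to check whether the Secret has shown up (the kubeconfig path is an assumption for illustration):

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for the cluster in question.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	s, err := cs.CoreV1().Secrets("openstack-operators").Get(
		context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
	if err != nil {
		fmt.Println("still missing:", err) // kubelet will keep retrying the mount
		return
	}
	fmt.Println("found secret with", len(s.Data), "keys") // mount should succeed on the next retry
}
```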
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.864167 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fj27n" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.871117 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6z6\" (UniqueName: \"kubernetes.io/projected/2eea8037-d11c-47ee-9bc9-67deafc20268-kube-api-access-2g6z6\") pod \"placement-operator-controller-manager-78f8948974-4q2vd\" (UID: \"2eea8037-d11c-47ee-9bc9-67deafc20268\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.874518 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.928591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5xv\" (UniqueName: \"kubernetes.io/projected/c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5-kube-api-access-cf5xv\") pod \"swift-operator-controller-manager-5f8c65bbfc-t4xtt\" (UID: \"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.928633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8ftv\" (UniqueName: \"kubernetes.io/projected/f68526b5-c6b6-484e-b476-1e4c76ba71fd-kube-api-access-w8ftv\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rqs2p\" (UID: \"f68526b5-c6b6-484e-b476-1e4c76ba71fd\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.928680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.928703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p275\" (UniqueName: \"kubernetes.io/projected/fdb3c987-9d79-4920-9b95-1be3a3dbc622-kube-api-access-4p275\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.928737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpkq4\" (UniqueName: \"kubernetes.io/projected/49b39782-af0e-4f86-89f4-96582b6a8336-kube-api-access-rpkq4\") pod \"test-operator-controller-manager-5854674fcc-565xh\" (UID: \"49b39782-af0e-4f86-89f4-96582b6a8336\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.929265 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 
20:21:10 crc kubenswrapper[4885]: E1205 20:21:10.929307 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:11.429293997 +0000 UTC m=+936.726109658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.934294 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.935130 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.937822 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.938095 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-brmc2" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.938538 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.971680 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2"] Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.973110 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5xv\" (UniqueName: \"kubernetes.io/projected/c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5-kube-api-access-cf5xv\") pod \"swift-operator-controller-manager-5f8c65bbfc-t4xtt\" (UID: \"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.976612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p275\" (UniqueName: \"kubernetes.io/projected/fdb3c987-9d79-4920-9b95-1be3a3dbc622-kube-api-access-4p275\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:10 crc kubenswrapper[4885]: I1205 20:21:10.994915 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.012409 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.014060 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.017587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.019990 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vhh2d" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.029458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8ftv\" (UniqueName: \"kubernetes.io/projected/f68526b5-c6b6-484e-b476-1e4c76ba71fd-kube-api-access-w8ftv\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rqs2p\" (UID: \"f68526b5-c6b6-484e-b476-1e4c76ba71fd\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.029539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpkq4\" (UniqueName: \"kubernetes.io/projected/49b39782-af0e-4f86-89f4-96582b6a8336-kube-api-access-rpkq4\") pod \"test-operator-controller-manager-5854674fcc-565xh\" (UID: \"49b39782-af0e-4f86-89f4-96582b6a8336\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.029781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97hls\" (UniqueName: \"kubernetes.io/projected/f9ccfa3f-a548-4e32-9318-b3f2cb19ccca-kube-api-access-97hls\") pod \"watcher-operator-controller-manager-769dc69bc-nrtkv\" (UID: \"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.047856 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8ftv\" (UniqueName: \"kubernetes.io/projected/f68526b5-c6b6-484e-b476-1e4c76ba71fd-kube-api-access-w8ftv\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rqs2p\" (UID: \"f68526b5-c6b6-484e-b476-1e4c76ba71fd\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.060666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpkq4\" (UniqueName: \"kubernetes.io/projected/49b39782-af0e-4f86-89f4-96582b6a8336-kube-api-access-rpkq4\") pod \"test-operator-controller-manager-5854674fcc-565xh\" (UID: \"49b39782-af0e-4f86-89f4-96582b6a8336\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.131389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.131431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m48w\" (UniqueName: 
\"kubernetes.io/projected/18cedf03-5e88-4513-b2cc-e364e749f219-kube-api-access-9m48w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qpp7t\" (UID: \"18cedf03-5e88-4513-b2cc-e364e749f219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.131462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97hls\" (UniqueName: \"kubernetes.io/projected/f9ccfa3f-a548-4e32-9318-b3f2cb19ccca-kube-api-access-97hls\") pod \"watcher-operator-controller-manager-769dc69bc-nrtkv\" (UID: \"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.131495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.131513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4kjt\" (UniqueName: \"kubernetes.io/projected/acaad339-be87-48ab-aee8-7f4637190768-kube-api-access-t4kjt\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.138179 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.173083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97hls\" (UniqueName: \"kubernetes.io/projected/f9ccfa3f-a548-4e32-9318-b3f2cb19ccca-kube-api-access-97hls\") pod \"watcher-operator-controller-manager-769dc69bc-nrtkv\" (UID: \"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.207735 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.233138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.233189 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m48w\" (UniqueName: \"kubernetes.io/projected/18cedf03-5e88-4513-b2cc-e364e749f219-kube-api-access-9m48w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qpp7t\" (UID: \"18cedf03-5e88-4513-b2cc-e364e749f219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.233230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.233259 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4kjt\" (UniqueName: \"kubernetes.io/projected/acaad339-be87-48ab-aee8-7f4637190768-kube-api-access-t4kjt\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.233497 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.233546 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:11.733529647 +0000 UTC m=+937.030345308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.233652 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.233684 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:11.733674262 +0000 UTC m=+937.030489923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.259406 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4kjt\" (UniqueName: \"kubernetes.io/projected/acaad339-be87-48ab-aee8-7f4637190768-kube-api-access-t4kjt\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.264779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.277437 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.287126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m48w\" (UniqueName: \"kubernetes.io/projected/18cedf03-5e88-4513-b2cc-e364e749f219-kube-api-access-9m48w\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qpp7t\" (UID: \"18cedf03-5e88-4513-b2cc-e364e749f219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.318931 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.338189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.348365 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2"] Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.374327 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0f526a_c496_478e_bc4c_e6478ebeb3ea.slice/crio-07b3d18ac66b42af1bcd04e8d13e2a3b5bc70d6d341b0c0433b9498fe005184a WatchSource:0}: Error finding container 07b3d18ac66b42af1bcd04e8d13e2a3b5bc70d6d341b0c0433b9498fe005184a: Status 404 returned error can't find the container with id 07b3d18ac66b42af1bcd04e8d13e2a3b5bc70d6d341b0c0433b9498fe005184a Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.379323 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.414635 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.438383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.438479 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.438529 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:12.438513421 +0000 UTC m=+937.735329082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.602596 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.610310 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m"] Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.611393 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod741c1713_f931_471e_ad95_99d16600ab76.slice/crio-79c289d6abdb704cace0c640d8a61f00362160a39955baf8c739bf5f6c09c75b WatchSource:0}: Error finding container 79c289d6abdb704cace0c640d8a61f00362160a39955baf8c739bf5f6c09c75b: Status 404 returned error can't find the container with id 79c289d6abdb704cace0c640d8a61f00362160a39955baf8c739bf5f6c09c75b Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.619706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.624929 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.638264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df"] Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.651472 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2eaf31_e16e_4072_ae6b_a5c9eda46732.slice/crio-f1eba5aa3507be8a8d770700d9581598dbee7ee7f80c78170428c3e4bc99bc38 WatchSource:0}: Error finding container f1eba5aa3507be8a8d770700d9581598dbee7ee7f80c78170428c3e4bc99bc38: Status 404 returned error can't find the container with id f1eba5aa3507be8a8d770700d9581598dbee7ee7f80c78170428c3e4bc99bc38 Dec 05 20:21:11 crc 
kubenswrapper[4885]: W1205 20:21:11.692195 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee66e99c_4761_43a5_a55c_b28957859913.slice/crio-391d1ed1676cd0c248c9e0e7e049c09a7320c14c204754d04c2461d13d8e3ada WatchSource:0}: Error finding container 391d1ed1676cd0c248c9e0e7e049c09a7320c14c204754d04c2461d13d8e3ada: Status 404 returned error can't find the container with id 391d1ed1676cd0c248c9e0e7e049c09a7320c14c204754d04c2461d13d8e3ada Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.734429 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.747652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.747732 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.747879 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.747933 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:12.747916585 +0000 UTC m=+938.044732246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.748355 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.748388 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:12.74837945 +0000 UTC m=+938.045195111 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.758576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.764626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.848422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.848616 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.848721 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert podName:f9775930-6d69-4ad4-a249-f5d2f270b365 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:13.848698968 +0000 UTC m=+939.145514629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert") pod "infra-operator-controller-manager-57548d458d-dpqcg" (UID: "f9775930-6d69-4ad4-a249-f5d2f270b365") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.879357 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.914063 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk"] Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.930161 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kl5ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-t4mch_openstack-operators(06e1a4eb-c6cb-4146-b2f9-484c2e699a7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.932057 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kl5ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-t4mch_openstack-operators(06e1a4eb-c6cb-4146-b2f9-484c2e699a7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.933893 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS 
exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" podUID="06e1a4eb-c6cb-4146-b2f9-484c2e699a7e" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.943386 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc8jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_openstack-operators(33f07e6f-9ac8-461d-b455-ad634c2e255c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.952328 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc8jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_openstack-operators(33f07e6f-9ac8-461d-b455-ad634c2e255c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.953424 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" podUID="33f07e6f-9ac8-461d-b455-ad634c2e255c" Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.959381 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.972377 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt"] Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.982112 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p"] Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.987854 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc20bdf47_2333_40eb_b5e1_4ad4ad32cdd5.slice/crio-300e22f0fc4402e442539abfa742954bd8c725190fd92253f229714172344498 WatchSource:0}: Error finding container 300e22f0fc4402e442539abfa742954bd8c725190fd92253f229714172344498: Status 404 returned error can't find the container with id 300e22f0fc4402e442539abfa742954bd8c725190fd92253f229714172344498 Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.989136 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68526b5_c6b6_484e_b476_1e4c76ba71fd.slice/crio-96966561d5a5396cd4ae53b58f91969f72ef680e37eb1940132153168a191f6d WatchSource:0}: Error finding container 96966561d5a5396cd4ae53b58f91969f72ef680e37eb1940132153168a191f6d: Status 404 returned error can't find the container with id 96966561d5a5396cd4ae53b58f91969f72ef680e37eb1940132153168a191f6d Dec 05 20:21:11 crc kubenswrapper[4885]: I1205 20:21:11.990041 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd"] Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.992926 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w8ftv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-rqs2p_openstack-operators(f68526b5-c6b6-484e-b476-1e4c76ba71fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.994672 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w8ftv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-rqs2p_openstack-operators(f68526b5-c6b6-484e-b476-1e4c76ba71fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: W1205 20:21:11.995458 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eea8037_d11c_47ee_9bc9_67deafc20268.slice/crio-ee058cc67757a6b7235a079975846e78979dea1f387f27031510d2056898067b WatchSource:0}: Error finding container ee058cc67757a6b7235a079975846e78979dea1f387f27031510d2056898067b: Status 404 returned error can't find the container with id ee058cc67757a6b7235a079975846e78979dea1f387f27031510d2056898067b Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.995859 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" podUID="f68526b5-c6b6-484e-b476-1e4c76ba71fd" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.996788 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2g6z6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4q2vd_openstack-operators(2eea8037-d11c-47ee-9bc9-67deafc20268): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.998301 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2g6z6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4q2vd_openstack-operators(2eea8037-d11c-47ee-9bc9-67deafc20268): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:11 crc kubenswrapper[4885]: E1205 20:21:11.999957 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" podUID="2eea8037-d11c-47ee-9bc9-67deafc20268" Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.055172 4885 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-565xh"] Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.060158 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv"] Dec 05 20:21:12 crc kubenswrapper[4885]: W1205 20:21:12.066976 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ccfa3f_a548_4e32_9318_b3f2cb19ccca.slice/crio-1f6aaf90dbabcef24a0cba61f9465e842f1d83c0301d5a991dadcda7420d8e54 WatchSource:0}: Error finding container 1f6aaf90dbabcef24a0cba61f9465e842f1d83c0301d5a991dadcda7420d8e54: Status 404 returned error can't find the container with id 1f6aaf90dbabcef24a0cba61f9465e842f1d83c0301d5a991dadcda7420d8e54 Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.072142 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97hls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-nrtkv_openstack-operators(f9ccfa3f-a548-4e32-9318-b3f2cb19ccca): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.079457 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97hls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-nrtkv_openstack-operators(f9ccfa3f-a548-4e32-9318-b3f2cb19ccca): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.084225 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" podUID="f9ccfa3f-a548-4e32-9318-b3f2cb19ccca"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.084324 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpkq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-565xh_openstack-operators(49b39782-af0e-4f86-89f4-96582b6a8336): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.095845 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpkq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-565xh_openstack-operators(49b39782-af0e-4f86-89f4-96582b6a8336): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.097569 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.125635 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t"]
Dec 05 20:21:12 crc kubenswrapper[4885]: W1205 20:21:12.150326 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cedf03_5e88_4513_b2cc_e364e749f219.slice/crio-c70b065b10a9b1409edbb7d2a4292f172ec28488efaf448928afca338bb20fc4 WatchSource:0}: Error finding container c70b065b10a9b1409edbb7d2a4292f172ec28488efaf448928afca338bb20fc4: Status 404 returned error can't find the container with id c70b065b10a9b1409edbb7d2a4292f172ec28488efaf448928afca338bb20fc4
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.153007 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9m48w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-qpp7t_openstack-operators(18cedf03-5e88-4513-b2cc-e364e749f219): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.161487 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podUID="18cedf03-5e88-4513-b2cc-e364e749f219"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.289150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" event={"ID":"ee66e99c-4761-43a5-a55c-b28957859913","Type":"ContainerStarted","Data":"391d1ed1676cd0c248c9e0e7e049c09a7320c14c204754d04c2461d13d8e3ada"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.291161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" event={"ID":"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e","Type":"ContainerStarted","Data":"06fc0b7dd34fb8c579e4534c773d030d5ae065dc7bfe4e830d179d6a77a1c3d7"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.293220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" event={"ID":"6a0f526a-c496-478e-bc4c-e6478ebeb3ea","Type":"ContainerStarted","Data":"07b3d18ac66b42af1bcd04e8d13e2a3b5bc70d6d341b0c0433b9498fe005184a"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.322361 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" podUID="06e1a4eb-c6cb-4146-b2f9-484c2e699a7e"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.323514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" event={"ID":"93741f1b-6823-4374-927f-38d95ba139f5","Type":"ContainerStarted","Data":"f34188911fa76bd7214645ccc3ada96b943650bc4be8bcebee484e57c2fda636"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.324802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" event={"ID":"18cedf03-5e88-4513-b2cc-e364e749f219","Type":"ContainerStarted","Data":"c70b065b10a9b1409edbb7d2a4292f172ec28488efaf448928afca338bb20fc4"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.326122 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podUID="18cedf03-5e88-4513-b2cc-e364e749f219"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.327654 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" event={"ID":"c942221f-6ad2-4109-9975-ec8054686283","Type":"ContainerStarted","Data":"420918b350c7e1c2d2a99924daa02aeb2c23ddd605ff5cb08d02a12a0512955b"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.332392 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" event={"ID":"e12a10c6-f52c-4348-bb54-356af7632dd4","Type":"ContainerStarted","Data":"d71a52ff2d3c7b1d4730a7af92c2eb0d65acf68ea3dfe300d90345281217f3a0"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.340433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" event={"ID":"3e2eaf31-e16e-4072-ae6b-a5c9eda46732","Type":"ContainerStarted","Data":"f1eba5aa3507be8a8d770700d9581598dbee7ee7f80c78170428c3e4bc99bc38"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.342170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" event={"ID":"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca","Type":"ContainerStarted","Data":"1f6aaf90dbabcef24a0cba61f9465e842f1d83c0301d5a991dadcda7420d8e54"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.346809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" event={"ID":"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5","Type":"ContainerStarted","Data":"300e22f0fc4402e442539abfa742954bd8c725190fd92253f229714172344498"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.353142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" event={"ID":"741c1713-f931-471e-ad95-99d16600ab76","Type":"ContainerStarted","Data":"79c289d6abdb704cace0c640d8a61f00362160a39955baf8c739bf5f6c09c75b"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.355645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" event={"ID":"33f07e6f-9ac8-461d-b455-ad634c2e255c","Type":"ContainerStarted","Data":"e3f22a79667f461d7374a4977afb3b74319fb18936f98bfc3d69d0def55347fd"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.364493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" event={"ID":"9034e951-dbbb-4927-b9fa-fa2e83c1595c","Type":"ContainerStarted","Data":"32c9768904de1ced7fbe3a0715e7056ced382e753cbdeb22fd2a0ba24105091b"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.371956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" podUID="f9ccfa3f-a548-4e32-9318-b3f2cb19ccca"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.383066 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" event={"ID":"74869c39-a4c4-4812-8656-4751d25ef987","Type":"ContainerStarted","Data":"02845b6e67b3fce31876c7e9e745508f8abd7ed953f7ff607a7636ba141ba768"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.389234 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" podUID="33f07e6f-9ac8-461d-b455-ad634c2e255c"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.396889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" event={"ID":"aed37ead-6406-43f0-a6f5-4e8864935a58","Type":"ContainerStarted","Data":"80ffc02beb7dcb26a33938ba58ed5ec0fd5f4db2dd987b18eb2fece2230c8c9a"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.400359 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" event={"ID":"da47cf7f-37ab-4d5d-99b1-1b312002f83e","Type":"ContainerStarted","Data":"385d590b29ca1f5da1e237590a30cd747d646f27dbd989adf2f272e3466220ab"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.401794 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" event={"ID":"2eea8037-d11c-47ee-9bc9-67deafc20268","Type":"ContainerStarted","Data":"ee058cc67757a6b7235a079975846e78979dea1f387f27031510d2056898067b"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.404218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" event={"ID":"f68526b5-c6b6-484e-b476-1e4c76ba71fd","Type":"ContainerStarted","Data":"96966561d5a5396cd4ae53b58f91969f72ef680e37eb1940132153168a191f6d"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.405957 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" podUID="f68526b5-c6b6-484e-b476-1e4c76ba71fd"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.407358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" event={"ID":"49b39782-af0e-4f86-89f4-96582b6a8336","Type":"ContainerStarted","Data":"279f4d053f985a132f0afe35e854db735aed5e3b55502be2008901eee1ebbb04"}
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.410242 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" podUID="2eea8037-d11c-47ee-9bc9-67deafc20268"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.410793 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.425139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" event={"ID":"ca2be922-afb3-4640-bdad-cfd3b0164d52","Type":"ContainerStarted","Data":"e788f8ee5578e6ede73967694df0e9ac06bb4d79b1db91c771822009c484a1d4"}
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.484754 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.485265 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.485310 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:14.485294901 +0000 UTC m=+939.782110562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.802080 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2"
Dec 05 20:21:12 crc kubenswrapper[4885]: I1205 20:21:12.802173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2"
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.802177 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.802263 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:14.802239084 +0000 UTC m=+940.099054825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.803626 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 20:21:12 crc kubenswrapper[4885]: E1205 20:21:12.803691 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:14.80367215 +0000 UTC m=+940.100487871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found
Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.435740 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podUID="18cedf03-5e88-4513-b2cc-e364e749f219"
Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.441332 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" podUID="33f07e6f-9ac8-461d-b455-ad634c2e255c"
Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.442437 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336"
Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.442552 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" podUID="f9ccfa3f-a548-4e32-9318-b3f2cb19ccca"
Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.442902 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" podUID="2eea8037-d11c-47ee-9bc9-67deafc20268" Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.444617 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" podUID="06e1a4eb-c6cb-4146-b2f9-484c2e699a7e" Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.448343 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" podUID="f68526b5-c6b6-484e-b476-1e4c76ba71fd" Dec 05 20:21:13 crc kubenswrapper[4885]: I1205 20:21:13.936881 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.937059 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:13 crc kubenswrapper[4885]: E1205 20:21:13.937104 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert podName:f9775930-6d69-4ad4-a249-f5d2f270b365 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:17.937089323 +0000 UTC m=+943.233904984 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert") pod "infra-operator-controller-manager-57548d458d-dpqcg" (UID: "f9775930-6d69-4ad4-a249-f5d2f270b365") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: I1205 20:21:14.545517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.545734 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.545886 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:18.545869521 +0000 UTC m=+943.842685182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: I1205 20:21:14.850055 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:14 crc kubenswrapper[4885]: I1205 20:21:14.850116 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.850255 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.850303 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:18.850287637 +0000 UTC m=+944.147103298 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.850599 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:21:14 crc kubenswrapper[4885]: E1205 20:21:14.850623 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:18.850616618 +0000 UTC m=+944.147432279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found Dec 05 20:21:16 crc kubenswrapper[4885]: I1205 20:21:16.631318 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:21:16 crc kubenswrapper[4885]: I1205 20:21:16.631385 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.028593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.029111 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.029166 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert podName:f9775930-6d69-4ad4-a249-f5d2f270b365 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:26.02914801 +0000 UTC m=+951.325963671 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert") pod "infra-operator-controller-manager-57548d458d-dpqcg" (UID: "f9775930-6d69-4ad4-a249-f5d2f270b365") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.448876 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.451156 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.464805 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.534194 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.534261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.534307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtmr\" (UniqueName: \"kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.635692 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.635795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.635846 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.635889 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtmr\" (UniqueName: \"kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.635902 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.635974 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert 
podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:26.635953806 +0000 UTC m=+951.932769467 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.636708 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.636742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.659819 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtmr\" (UniqueName: \"kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr\") pod \"redhat-marketplace-cg9xw\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.838248 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.940176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:18 crc kubenswrapper[4885]: I1205 20:21:18.940240 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.940373 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.940414 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.940437 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:26.940419713 +0000 UTC m=+952.237235374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found Dec 05 20:21:18 crc kubenswrapper[4885]: E1205 20:21:18.940543 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:26.940520266 +0000 UTC m=+952.237335997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found Dec 05 20:21:25 crc kubenswrapper[4885]: E1205 20:21:25.897284 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 20:21:25 crc kubenswrapper[4885]: E1205 20:21:25.898789 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7dgcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-r6ljq_openstack-operators(da47cf7f-37ab-4d5d-99b1-1b312002f83e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.070925 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.082424 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9775930-6d69-4ad4-a249-f5d2f270b365-cert\") pod \"infra-operator-controller-manager-57548d458d-dpqcg\" (UID: \"f9775930-6d69-4ad4-a249-f5d2f270b365\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.282122 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.353816 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.488115 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l79g4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-w5c5m_openstack-operators(3e2eaf31-e16e-4072-ae6b-a5c9eda46732): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.489471 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" podUID="3e2eaf31-e16e-4072-ae6b-a5c9eda46732" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.616654 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" event={"ID":"93741f1b-6823-4374-927f-38d95ba139f5","Type":"ContainerStarted","Data":"d08d820da56e5f7ab3249bb096a6ebcda646d83369d7b8d62805f0c5a9d0b684"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.627775 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" event={"ID":"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5","Type":"ContainerStarted","Data":"05f06ed04faacd316bb379209ed01d97c1ae3c7fdadcb44354bd1003d5662ec4"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.628931 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" event={"ID":"ee66e99c-4761-43a5-a55c-b28957859913","Type":"ContainerStarted","Data":"0bcfdcdd51a8d10ff982ab4eb2ee48db37f9ae924c6dbf69a4cb54399281a203"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.635148 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" event={"ID":"c942221f-6ad2-4109-9975-ec8054686283","Type":"ContainerStarted","Data":"506ac0a6f273f659d036314aeec72e10e6109fe582ebf9c6c86d15f07c3986ce"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.661284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" event={"ID":"e12a10c6-f52c-4348-bb54-356af7632dd4","Type":"ContainerStarted","Data":"ef5f3d2a95fcefc972a644034110a0326010cf941eb336158186ef3d3586be31"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.668801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" event={"ID":"3e2eaf31-e16e-4072-ae6b-a5c9eda46732","Type":"ContainerStarted","Data":"8d63b44caab9fed66d842e759363d32d30880af4b1ea6e63d168cccddc4fe7fb"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.668862 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.670215 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" podUID="3e2eaf31-e16e-4072-ae6b-a5c9eda46732" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.671315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" event={"ID":"74869c39-a4c4-4812-8656-4751d25ef987","Type":"ContainerStarted","Data":"8bcc5ad7262c2b76126d219de289e0fb5732fd29873a129932c3dbbf459235b1"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.672294 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" event={"ID":"6a0f526a-c496-478e-bc4c-e6478ebeb3ea","Type":"ContainerStarted","Data":"b3685a3bf8d4d2a8aade5574986868e5464926e95a295ae2ce3ac355828128a9"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.680848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.681059 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.681118 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert podName:fdb3c987-9d79-4920-9b95-1be3a3dbc622 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:42.681099453 +0000 UTC m=+967.977915114 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" (UID: "fdb3c987-9d79-4920-9b95-1be3a3dbc622") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.693785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" event={"ID":"aed37ead-6406-43f0-a6f5-4e8864935a58","Type":"ContainerStarted","Data":"ec1c59cae293e4f289120980072a6027f4412531f9b3644e3b1f26389ca1a557"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.696269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerStarted","Data":"bcdfbde070bf905231d1559ffada506fac4ec098e04648b1174ebad332c7dca8"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.713607 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" event={"ID":"741c1713-f931-471e-ad95-99d16600ab76","Type":"ContainerStarted","Data":"0049428d9454896a801d9e3067afc2a1b385c705d7efae193e2c2d00231a8f8a"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.725386 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" event={"ID":"ca2be922-afb3-4640-bdad-cfd3b0164d52","Type":"ContainerStarted","Data":"280b0c84073a4bb28e85797d3d3266ddd9aa7a0111c3cd68f31c0a82aa0563e9"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.732264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" event={"ID":"9034e951-dbbb-4927-b9fa-fa2e83c1595c","Type":"ContainerStarted","Data":"05caff693a5c087550b709b0f36b93cb31706220642025a6f8459302a75b5a07"} Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.985370 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:26 crc kubenswrapper[4885]: I1205 20:21:26.985693 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.985869 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.985907 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:42.985894391 +0000 UTC m=+968.282710052 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "webhook-server-cert" not found Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.986379 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:21:26 crc kubenswrapper[4885]: E1205 20:21:26.986406 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs podName:acaad339-be87-48ab-aee8-7f4637190768 nodeName:}" failed. No retries permitted until 2025-12-05 20:21:42.986398017 +0000 UTC m=+968.283213678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b47j2" (UID: "acaad339-be87-48ab-aee8-7f4637190768") : secret "metrics-server-cert" not found Dec 05 20:21:27 crc kubenswrapper[4885]: I1205 20:21:27.006337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg"] Dec 05 20:21:27 crc kubenswrapper[4885]: E1205 20:21:27.177087 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9m48w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-qpp7t_openstack-operators(18cedf03-5e88-4513-b2cc-e364e749f219): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:27 crc 
kubenswrapper[4885]: E1205 20:21:27.177215 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpkq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-565xh_openstack-operators(49b39782-af0e-4f86-89f4-96582b6a8336): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:27 crc kubenswrapper[4885]: E1205 20:21:27.183308 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podUID="18cedf03-5e88-4513-b2cc-e364e749f219" Dec 05 20:21:27 crc kubenswrapper[4885]: E1205 20:21:27.188424 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpkq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-565xh_openstack-operators(49b39782-af0e-4f86-89f4-96582b6a8336): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:21:27 crc kubenswrapper[4885]: E1205 20:21:27.190149 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336" Dec 05 20:21:27 crc kubenswrapper[4885]: W1205 20:21:27.349241 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9775930_6d69_4ad4_a249_f5d2f270b365.slice/crio-0e04e10be55d7d285269cad23c5d8ab6ba4af8683762824df61bfa4acc1d7dc9 WatchSource:0}: Error finding container 0e04e10be55d7d285269cad23c5d8ab6ba4af8683762824df61bfa4acc1d7dc9: Status 404 returned error can't find the container with id 0e04e10be55d7d285269cad23c5d8ab6ba4af8683762824df61bfa4acc1d7dc9 Dec 05 20:21:27 crc kubenswrapper[4885]: I1205 20:21:27.766056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" event={"ID":"f9775930-6d69-4ad4-a249-f5d2f270b365","Type":"ContainerStarted","Data":"0e04e10be55d7d285269cad23c5d8ab6ba4af8683762824df61bfa4acc1d7dc9"} Dec 05 20:21:27 crc kubenswrapper[4885]: I1205 20:21:27.792844 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerID="5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d" exitCode=0 Dec 05 20:21:27 crc kubenswrapper[4885]: I1205 20:21:27.792910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerDied","Data":"5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d"} Dec 05 20:21:27 crc kubenswrapper[4885]: E1205 20:21:27.795375 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" podUID="3e2eaf31-e16e-4072-ae6b-a5c9eda46732" Dec 05 20:21:32 crc kubenswrapper[4885]: I1205 20:21:32.910616 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:21:32 crc 
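
The "pull QPS exceeded" errors above are not registry-side throttling: the kubelet itself rate-limits image pulls with a token bucket sized by the KubeletConfiguration fields registryPullQPS and registryBurst (defaults 5 and 10). With a dozen operator deployments created at once, pulls beyond the burst fail immediately rather than queueing, and the affected pods drop into image-pull backoff. A sketch of the mechanism using x/time/rate; the kubelet uses its own internal limiter, so treat this as illustrative:

    package main

    import (
    	"errors"
    	"fmt"

    	"golang.org/x/time/rate"
    )

    var errQPS = errors.New("pull QPS exceeded")

    // pull refuses immediately when no token is available, which is
    // what turns a burst of simultaneous pod creations into the
    // ErrImagePull entries seen above.
    func pull(limiter *rate.Limiter, image string) error {
    	if !limiter.Allow() {
    		return errQPS
    	}
    	// ... the actual pull via the CRI would happen here ...
    	return nil
    }

    func main() {
    	limiter := rate.NewLimiter(rate.Limit(5), 10) // qps=5, burst=10
    	for i := 0; i < 14; i++ {
    		err := pull(limiter, fmt.Sprintf("quay.io/example/operator-%d", i))
    		fmt.Printf("pull %2d: %v\n", i, err)
    	}
    }

In a tight loop the first ten pulls consume the burst and the remainder fail, matching the pattern of a few operators starting promptly while rabbitmq-cluster-operator and test-operator are pushed into retries.
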
kubenswrapper[4885]: I1205 20:21:32.913068 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:32 crc kubenswrapper[4885]: I1205 20:21:32.934702 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.015552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.015894 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.015977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrtq\" (UniqueName: \"kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.117451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.117591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.117617 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrtq\" (UniqueName: \"kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.117930 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.118302 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 
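
The three VerifyControllerAttachedVolume/MountVolume pairs above correspond to an ordinary catalog pod volume set: two emptyDirs plus the projected service-account token volume the API server injects as kube-api-access-<suffix>. The reconciler verifies each volume (trivial for these plugin types), then mounts it; only after all three MountVolume.SetUp calls succeed does sandbox creation proceed. Reconstructed here as client-go structures, assumed from the volume names in the events rather than taken from the pod manifest:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	vols := []corev1.Volume{
    		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "utilities", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "kube-api-access-knrtq", VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{
    				Sources: []corev1.VolumeProjection{
    					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
    				},
    			}}},
    	}
    	for _, v := range vols {
    		fmt.Println(v.Name) // the three volumes being reconciled above
    	}
    }
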
05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.136614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrtq\" (UniqueName: \"kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq\") pod \"community-operators-pqjfv\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:33 crc kubenswrapper[4885]: I1205 20:21:33.237353 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:34 crc kubenswrapper[4885]: I1205 20:21:34.441400 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:21:34 crc kubenswrapper[4885]: W1205 20:21:34.519164 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ccb9d8c_dba1_494f_9d52_1b607aba77f2.slice/crio-896465794d6cb28cee59dcb53af6a1d982ea5ed769342706adae912006507e87 WatchSource:0}: Error finding container 896465794d6cb28cee59dcb53af6a1d982ea5ed769342706adae912006507e87: Status 404 returned error can't find the container with id 896465794d6cb28cee59dcb53af6a1d982ea5ed769342706adae912006507e87 Dec 05 20:21:34 crc kubenswrapper[4885]: I1205 20:21:34.841000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerStarted","Data":"896465794d6cb28cee59dcb53af6a1d982ea5ed769342706adae912006507e87"} Dec 05 20:21:38 crc kubenswrapper[4885]: E1205 20:21:38.198936 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" podUID="da47cf7f-37ab-4d5d-99b1-1b312002f83e" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.883222 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" event={"ID":"2eea8037-d11c-47ee-9bc9-67deafc20268","Type":"ContainerStarted","Data":"1a7e49a80396a2453de735a1a031a92e10d48d739aad228a24928f84e53c5d0d"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.883545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" event={"ID":"2eea8037-d11c-47ee-9bc9-67deafc20268","Type":"ContainerStarted","Data":"e4ae14fb3bf1e2b600cd2977a65dd9ab2d79411564e1699751fa78bcfddac8ef"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.884391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.891396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" event={"ID":"741c1713-f931-471e-ad95-99d16600ab76","Type":"ContainerStarted","Data":"a8351bb823991a6071222e253527ff51dc9ddfdc64b9a94932e4c32963de2937"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.891736 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.893944 4885 
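
The keystone-operator pull failure just above ("copying config: context canceled") is a different animal from the QPS errors: this pull was started and then cancelled mid-transfer, the cancellation propagating through the CRI's gRPC context into the registry copy. The retry succeeds shortly afterwards; the pod's containers are started by 20:21:40. The propagation pattern in miniature, with the transfer simulated by a timer:

    package main

    import (
    	"context"
    	"fmt"
    	"time"
    )

    // pull honours its context the way a CRI ImagePull call does:
    // cancelling the context aborts the in-flight transfer and the
    // cancellation surfaces in the error text.
    func pull(ctx context.Context) error {
    	select {
    	case <-time.After(5 * time.Second): // pretend transfer time
    		return nil
    	case <-ctx.Done():
    		return fmt.Errorf("copying config: %w", ctx.Err())
    	}
    }

    func main() {
    	ctx, cancel := context.WithCancel(context.Background())
    	go func() { time.Sleep(100 * time.Millisecond); cancel() }()
    	fmt.Println(pull(ctx)) // copying config: context canceled
    }
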
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" event={"ID":"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e","Type":"ContainerStarted","Data":"5cf66abc4a69b0e7746385ceabda485fa859062c4c15adf40ff13d7b521c43d7"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.898460 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.906773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" event={"ID":"9034e951-dbbb-4927-b9fa-fa2e83c1595c","Type":"ContainerStarted","Data":"aa2f4ec262cf6c34829a95ea1a7a955cbd1631093cb0d87d5fa35634621dd1db"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.906828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.911005 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.913442 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" event={"ID":"74869c39-a4c4-4812-8656-4751d25ef987","Type":"ContainerStarted","Data":"b5dc040c816a4029e24c9d82e062426661ac1f6e871b5818153da5e470d731ec"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.914273 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.920740 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.924562 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" podStartSLOduration=3.179055061 podStartE2EDuration="28.924548643s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.996713833 +0000 UTC m=+937.293529494" lastFinishedPulling="2025-12-05 20:21:37.742207415 +0000 UTC m=+963.039023076" observedRunningTime="2025-12-05 20:21:38.924156391 +0000 UTC m=+964.220972052" watchObservedRunningTime="2025-12-05 20:21:38.924548643 +0000 UTC m=+964.221364304" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.938361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" event={"ID":"93741f1b-6823-4374-927f-38d95ba139f5","Type":"ContainerStarted","Data":"6fa909f0d695cbc4b51c5e849b28dfbec540622d7d09f294fcae05a61a75a3d5"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.939176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.945545 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.946458 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" event={"ID":"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca","Type":"ContainerStarted","Data":"63dd0aa3b7e3849e9294cc14a1878ba71ed56190edf981f03bf4fc394805b543"} Dec 05 20:21:38 crc kubenswrapper[4885]: I1205 20:21:38.979691 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" event={"ID":"da47cf7f-37ab-4d5d-99b1-1b312002f83e","Type":"ContainerStarted","Data":"8d36702e7d8a1a76368b4b5f58613936454bcd52b334bd1c476236476ee9d65b"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:38.996136 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-cqj46" podStartSLOduration=3.780474293 podStartE2EDuration="29.996115968s" podCreationTimestamp="2025-12-05 20:21:09 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.651611654 +0000 UTC m=+936.948427315" lastFinishedPulling="2025-12-05 20:21:37.867253329 +0000 UTC m=+963.164068990" observedRunningTime="2025-12-05 20:21:38.994325341 +0000 UTC m=+964.291141012" watchObservedRunningTime="2025-12-05 20:21:38.996115968 +0000 UTC m=+964.292931629" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.003855 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" event={"ID":"ee66e99c-4761-43a5-a55c-b28957859913","Type":"ContainerStarted","Data":"7d0bb54b7dcdd34f43694387fc203dead89fac9933110a81ebf48d6dbd423b8a"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.004842 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.006472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-z27c2" podStartSLOduration=2.8197528309999997 podStartE2EDuration="29.006454666s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.613616397 +0000 UTC m=+936.910432058" lastFinishedPulling="2025-12-05 20:21:37.800318232 +0000 UTC m=+963.097133893" observedRunningTime="2025-12-05 20:21:38.961528928 +0000 UTC m=+964.258344589" watchObservedRunningTime="2025-12-05 20:21:39.006454666 +0000 UTC m=+964.303270327" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.027778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.028706 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kgdg2" podStartSLOduration=3.64354152 podStartE2EDuration="30.028687383s" podCreationTimestamp="2025-12-05 20:21:09 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.391639911 +0000 UTC m=+936.688455572" lastFinishedPulling="2025-12-05 20:21:37.776785774 +0000 UTC m=+963.073601435" observedRunningTime="2025-12-05 20:21:39.02732636 +0000 UTC m=+964.324142021" watchObservedRunningTime="2025-12-05 20:21:39.028687383 +0000 UTC m=+964.325503044" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.051308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" event={"ID":"33f07e6f-9ac8-461d-b455-ad634c2e255c","Type":"ContainerStarted","Data":"3ada55f52052099a9fe93392e74f484d25728073598426396fc2c7fa7f96d17d"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.051943 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.068239 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-s4ftd" podStartSLOduration=3.571314334 podStartE2EDuration="30.068223589s" podCreationTimestamp="2025-12-05 20:21:09 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.369543669 +0000 UTC m=+936.666359330" lastFinishedPulling="2025-12-05 20:21:37.866452924 +0000 UTC m=+963.163268585" observedRunningTime="2025-12-05 20:21:39.06572926 +0000 UTC m=+964.362544921" watchObservedRunningTime="2025-12-05 20:21:39.068223589 +0000 UTC m=+964.365039250" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.084795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" event={"ID":"c942221f-6ad2-4109-9975-ec8054686283","Type":"ContainerStarted","Data":"2ee35304c9725fd7f5870b2b41c700aff8b56b6de2f51a1f1a215a1fb8e35b44"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.085139 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.086844 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" event={"ID":"6a0f526a-c496-478e-bc4c-e6478ebeb3ea","Type":"ContainerStarted","Data":"b1092e69cb63991729f76e63227bf0d81368b997129e3d46b6cbc4c0dc8b034f"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.091128 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.104145 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-zz7df" podStartSLOduration=3.025314745 podStartE2EDuration="29.1041233s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.695235291 +0000 UTC m=+936.992050952" lastFinishedPulling="2025-12-05 20:21:37.774043846 +0000 UTC m=+963.070859507" observedRunningTime="2025-12-05 20:21:39.09278432 +0000 UTC m=+964.389599971" watchObservedRunningTime="2025-12-05 20:21:39.1041233 +0000 UTC m=+964.400938951" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.108845 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.114176 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerID="f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34" exitCode=0 Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.114240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" 
event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerDied","Data":"f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.132438 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.146162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" event={"ID":"f9775930-6d69-4ad4-a249-f5d2f270b365","Type":"ContainerStarted","Data":"d98d663aba263277ae469ea5fca9575e231b4257fde3fa95ae7b388d407b1c1c"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.146741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.148192 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" podStartSLOduration=3.400881511 podStartE2EDuration="29.148184501s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.942811569 +0000 UTC m=+937.239627230" lastFinishedPulling="2025-12-05 20:21:37.690114569 +0000 UTC m=+962.986930220" observedRunningTime="2025-12-05 20:21:39.147065555 +0000 UTC m=+964.443881216" watchObservedRunningTime="2025-12-05 20:21:39.148184501 +0000 UTC m=+964.445000162" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.185843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.186148 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" event={"ID":"e12a10c6-f52c-4348-bb54-356af7632dd4","Type":"ContainerStarted","Data":"d82ef1f3124ab8aa037fd160e8f10be8ba2bccb1d055a0a707bc20b0f812d15e"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.186259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.187614 4885 generic.go:334] "Generic (PLEG): container finished" podID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerID="faaa14a4276b53835a663c15cdae1677cb8befba539772d4ecdf13baa2f599c1" exitCode=0 Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.187721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerDied","Data":"faaa14a4276b53835a663c15cdae1677cb8befba539772d4ecdf13baa2f599c1"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.199036 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-nqshj" podStartSLOduration=3.741143812 podStartE2EDuration="30.198999245s" podCreationTimestamp="2025-12-05 20:21:09 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.38181934 +0000 UTC m=+936.678635001" lastFinishedPulling="2025-12-05 20:21:37.839674773 +0000 UTC m=+963.136490434" observedRunningTime="2025-12-05 20:21:39.19377203 +0000 UTC m=+964.490587691" watchObservedRunningTime="2025-12-05 
20:21:39.198999245 +0000 UTC m=+964.495814906" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.228889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" event={"ID":"ca2be922-afb3-4640-bdad-cfd3b0164d52","Type":"ContainerStarted","Data":"c3d770e9f024f517832fc6989ba52df7aa99efa0d92094118a487987acf77bcf"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.229618 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.247699 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.255959 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-rqh2l" podStartSLOduration=4.111218504 podStartE2EDuration="30.255933915s" podCreationTimestamp="2025-12-05 20:21:09 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.629690137 +0000 UTC m=+936.926505798" lastFinishedPulling="2025-12-05 20:21:37.774405518 +0000 UTC m=+963.071221209" observedRunningTime="2025-12-05 20:21:39.250180833 +0000 UTC m=+964.546996504" watchObservedRunningTime="2025-12-05 20:21:39.255933915 +0000 UTC m=+964.552749576" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.257849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" event={"ID":"f68526b5-c6b6-484e-b476-1e4c76ba71fd","Type":"ContainerStarted","Data":"9da8a33c5f9319cc61dfaa73d993d8e321550d3834c870b735dd1c847d9d917d"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.257899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" event={"ID":"f68526b5-c6b6-484e-b476-1e4c76ba71fd","Type":"ContainerStarted","Data":"154e898f5faefba4fc136be1815d704c1dec4873fea9a6709833e310e01a9b91"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.258185 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.290725 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hkw2j" podStartSLOduration=3.258053272 podStartE2EDuration="29.290700111s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.768238581 +0000 UTC m=+937.065054242" lastFinishedPulling="2025-12-05 20:21:37.80088542 +0000 UTC m=+963.097701081" observedRunningTime="2025-12-05 20:21:39.287701285 +0000 UTC m=+964.584516946" watchObservedRunningTime="2025-12-05 20:21:39.290700111 +0000 UTC m=+964.587515782" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.297260 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" event={"ID":"aed37ead-6406-43f0-a6f5-4e8864935a58","Type":"ContainerStarted","Data":"eca664db4999a6d6e57d7e94f97689f817d97345e162ed5efd5132145ea180a6"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.298119 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.332352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.333544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" event={"ID":"c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5","Type":"ContainerStarted","Data":"5605dffc7fd0ece707908aed246bd5b5d6514f3b8e97f55e0493f5fcf59fd0dd"} Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.333927 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.354729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4vb99" podStartSLOduration=3.4626377440000002 podStartE2EDuration="29.354712005s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.884744514 +0000 UTC m=+937.181560175" lastFinishedPulling="2025-12-05 20:21:37.776818735 +0000 UTC m=+963.073634436" observedRunningTime="2025-12-05 20:21:39.353913729 +0000 UTC m=+964.650729400" watchObservedRunningTime="2025-12-05 20:21:39.354712005 +0000 UTC m=+964.651527676" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.366946 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.434909 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" podStartSLOduration=19.070993221 podStartE2EDuration="29.434887153s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:27.353642079 +0000 UTC m=+952.650457740" lastFinishedPulling="2025-12-05 20:21:37.717536011 +0000 UTC m=+963.014351672" observedRunningTime="2025-12-05 20:21:39.397118193 +0000 UTC m=+964.693933854" watchObservedRunningTime="2025-12-05 20:21:39.434887153 +0000 UTC m=+964.731702804" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.453557 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" podStartSLOduration=7.189617276 podStartE2EDuration="29.453540515s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.992835909 +0000 UTC m=+937.289651570" lastFinishedPulling="2025-12-05 20:21:34.256759148 +0000 UTC m=+959.553574809" observedRunningTime="2025-12-05 20:21:39.450587043 +0000 UTC m=+964.747402704" watchObservedRunningTime="2025-12-05 20:21:39.453540515 +0000 UTC m=+964.750356176" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.531322 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gwtxz" podStartSLOduration=3.453064301 podStartE2EDuration="29.531294848s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.775929286 +0000 UTC m=+937.072744947" lastFinishedPulling="2025-12-05 20:21:37.854159833 +0000 UTC m=+963.150975494" 
observedRunningTime="2025-12-05 20:21:39.502489272 +0000 UTC m=+964.799304933" watchObservedRunningTime="2025-12-05 20:21:39.531294848 +0000 UTC m=+964.828110509" Dec 05 20:21:39 crc kubenswrapper[4885]: I1205 20:21:39.532726 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-t4xtt" podStartSLOduration=3.633346959 podStartE2EDuration="29.532718992s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.989851604 +0000 UTC m=+937.286667275" lastFinishedPulling="2025-12-05 20:21:37.889223647 +0000 UTC m=+963.186039308" observedRunningTime="2025-12-05 20:21:39.478575022 +0000 UTC m=+964.775390683" watchObservedRunningTime="2025-12-05 20:21:39.532718992 +0000 UTC m=+964.829534643" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.342248 4885 generic.go:334] "Generic (PLEG): container finished" podID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerID="31c3717e9dc2ef32edbd7abc4ef84f8dc7ce7a03c4d9d7f511aa57bd1207ff0f" exitCode=0 Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.342317 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerDied","Data":"31c3717e9dc2ef32edbd7abc4ef84f8dc7ce7a03c4d9d7f511aa57bd1207ff0f"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.344865 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" event={"ID":"33f07e6f-9ac8-461d-b455-ad634c2e255c","Type":"ContainerStarted","Data":"30c10fb94b95a6791ad0b2a26dcd3dd5a2af95d33fd430a8f99ffbc74b2ecb6f"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.348742 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" event={"ID":"06e1a4eb-c6cb-4146-b2f9-484c2e699a7e","Type":"ContainerStarted","Data":"a27d6889055cfe17b4406e27077d834064a96f9a6e1762b0b9902932cf0ecaf1"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.348820 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.351383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" event={"ID":"f9775930-6d69-4ad4-a249-f5d2f270b365","Type":"ContainerStarted","Data":"03b8ae174e4855e29df98748a6071c7a5d1ef762a1979598d0379a03457090ce"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.353481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" event={"ID":"f9ccfa3f-a548-4e32-9318-b3f2cb19ccca","Type":"ContainerStarted","Data":"21413fecdb8231204c132390ae5219f2ee9948c15c370bf1ca3c554300a001b7"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.353609 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.355406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerStarted","Data":"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006"} Dec 05 20:21:40 crc 
kubenswrapper[4885]: I1205 20:21:40.358248 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" event={"ID":"da47cf7f-37ab-4d5d-99b1-1b312002f83e","Type":"ContainerStarted","Data":"bd09306e622b314c397aaa135c3e00c6784c8d094421bcc4355d2ef29b1f8128"} Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.359707 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.384674 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cg9xw" podStartSLOduration=9.649802142 podStartE2EDuration="22.38465663s" podCreationTimestamp="2025-12-05 20:21:18 +0000 UTC" firstStartedPulling="2025-12-05 20:21:26.958300644 +0000 UTC m=+952.255116305" lastFinishedPulling="2025-12-05 20:21:39.693155132 +0000 UTC m=+964.989970793" observedRunningTime="2025-12-05 20:21:40.381565791 +0000 UTC m=+965.678381462" watchObservedRunningTime="2025-12-05 20:21:40.38465663 +0000 UTC m=+965.681472301" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.400248 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" podStartSLOduration=4.583449446 podStartE2EDuration="30.400233024s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.929989572 +0000 UTC m=+937.226805233" lastFinishedPulling="2025-12-05 20:21:37.74677315 +0000 UTC m=+963.043588811" observedRunningTime="2025-12-05 20:21:40.399737948 +0000 UTC m=+965.696553609" watchObservedRunningTime="2025-12-05 20:21:40.400233024 +0000 UTC m=+965.697048685" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.416182 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" podStartSLOduration=2.6526900810000003 podStartE2EDuration="30.416166011s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.730280344 +0000 UTC m=+937.027096005" lastFinishedPulling="2025-12-05 20:21:39.493756274 +0000 UTC m=+964.790571935" observedRunningTime="2025-12-05 20:21:40.414656853 +0000 UTC m=+965.711472514" watchObservedRunningTime="2025-12-05 20:21:40.416166011 +0000 UTC m=+965.712981672" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.436643 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" podStartSLOduration=4.789929159 podStartE2EDuration="30.436625661s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:12.071915303 +0000 UTC m=+937.368730964" lastFinishedPulling="2025-12-05 20:21:37.718611805 +0000 UTC m=+963.015427466" observedRunningTime="2025-12-05 20:21:40.434063909 +0000 UTC m=+965.730879590" watchObservedRunningTime="2025-12-05 20:21:40.436625661 +0000 UTC m=+965.733441322" Dec 05 20:21:40 crc kubenswrapper[4885]: I1205 20:21:40.638651 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" Dec 05 20:21:41 crc kubenswrapper[4885]: E1205 20:21:41.338990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336" Dec 05 20:21:41 crc kubenswrapper[4885]: I1205 20:21:41.366878 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerStarted","Data":"565114308b5870d3b9b58cdf71ff0b37f6e72c683100a21462ad211b93e387f1"} Dec 05 20:21:41 crc kubenswrapper[4885]: I1205 20:21:41.368325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" event={"ID":"49b39782-af0e-4f86-89f4-96582b6a8336","Type":"ContainerStarted","Data":"42346bbb4b442899d1f627cb987e452c9aaf5fe27151b7fa6441dc8279b7f47f"} Dec 05 20:21:41 crc kubenswrapper[4885]: E1205 20:21:41.369659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podUID="49b39782-af0e-4f86-89f4-96582b6a8336" Dec 05 20:21:41 crc kubenswrapper[4885]: I1205 20:21:41.370805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" event={"ID":"3e2eaf31-e16e-4072-ae6b-a5c9eda46732","Type":"ContainerStarted","Data":"fa6ecaac1b52058aae46fc5caafdb2176f5e8070a0b9e94415e32ac9e0ac878b"} Dec 05 20:21:41 crc kubenswrapper[4885]: I1205 20:21:41.385411 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pqjfv" podStartSLOduration=7.8640271219999995 podStartE2EDuration="9.385396655s" podCreationTimestamp="2025-12-05 20:21:32 +0000 UTC" firstStartedPulling="2025-12-05 20:21:39.209093347 +0000 UTC m=+964.505908998" lastFinishedPulling="2025-12-05 20:21:40.73046287 +0000 UTC m=+966.027278531" observedRunningTime="2025-12-05 20:21:41.381662547 +0000 UTC m=+966.678478218" watchObservedRunningTime="2025-12-05 20:21:41.385396655 +0000 UTC m=+966.682212316" Dec 05 20:21:41 crc kubenswrapper[4885]: I1205 20:21:41.413527 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5c5m" podStartSLOduration=17.085691421 podStartE2EDuration="31.413507309s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:11.669138741 +0000 UTC m=+936.965954402" lastFinishedPulling="2025-12-05 20:21:25.996954629 +0000 UTC m=+951.293770290" observedRunningTime="2025-12-05 20:21:41.412298391 +0000 UTC m=+966.709114062" watchObservedRunningTime="2025-12-05 20:21:41.413507309 +0000 UTC m=+966.710322990" Dec 05 20:21:42 crc kubenswrapper[4885]: E1205 20:21:42.174108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podUID="18cedf03-5e88-4513-b2cc-e364e749f219" Dec 05 20:21:42 crc 
kubenswrapper[4885]: I1205 20:21:42.696221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:42 crc kubenswrapper[4885]: I1205 20:21:42.707132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb3c987-9d79-4920-9b95-1be3a3dbc622-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5qfqlm\" (UID: \"fdb3c987-9d79-4920-9b95-1be3a3dbc622\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:42 crc kubenswrapper[4885]: I1205 20:21:42.976948 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jv2ts" Dec 05 20:21:42 crc kubenswrapper[4885]: I1205 20:21:42.985881 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.001119 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.001215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.005703 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.007488 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acaad339-be87-48ab-aee8-7f4637190768-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b47j2\" (UID: \"acaad339-be87-48ab-aee8-7f4637190768\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.188576 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-brmc2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.198861 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.238100 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.238266 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.289517 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.411370 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm"] Dec 05 20:21:43 crc kubenswrapper[4885]: I1205 20:21:43.638261 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2"] Dec 05 20:21:43 crc kubenswrapper[4885]: W1205 20:21:43.645302 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacaad339_be87_48ab_aee8_7f4637190768.slice/crio-0e499623614d02c92bfd209a0628508b90b83aadac2fe39cb325b5ea1563c2f3 WatchSource:0}: Error finding container 0e499623614d02c92bfd209a0628508b90b83aadac2fe39cb325b5ea1563c2f3: Status 404 returned error can't find the container with id 0e499623614d02c92bfd209a0628508b90b83aadac2fe39cb325b5ea1563c2f3 Dec 05 20:21:44 crc kubenswrapper[4885]: I1205 20:21:44.400183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" event={"ID":"acaad339-be87-48ab-aee8-7f4637190768","Type":"ContainerStarted","Data":"98d44ae5472df85ba20ab50aaf08dadb1597b7f88d0f5f96e9bc19652cdac21b"} Dec 05 20:21:44 crc kubenswrapper[4885]: I1205 20:21:44.400442 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" event={"ID":"acaad339-be87-48ab-aee8-7f4637190768","Type":"ContainerStarted","Data":"0e499623614d02c92bfd209a0628508b90b83aadac2fe39cb325b5ea1563c2f3"} Dec 05 20:21:44 crc kubenswrapper[4885]: I1205 20:21:44.401188 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:44 crc kubenswrapper[4885]: I1205 20:21:44.406046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" event={"ID":"fdb3c987-9d79-4920-9b95-1be3a3dbc622","Type":"ContainerStarted","Data":"1cc15fc6bcd8941cac1049f5510a78ce729e00cca7a42e50a43961326297e5db"} Dec 05 20:21:44 crc kubenswrapper[4885]: I1205 20:21:44.445413 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" podStartSLOduration=34.44538778 podStartE2EDuration="34.44538778s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:21:44.439990829 +0000 UTC m=+969.736806580" watchObservedRunningTime="2025-12-05 20:21:44.44538778 +0000 UTC m=+969.742203461" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 
20:21:46.288933 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dpqcg" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.423568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" event={"ID":"fdb3c987-9d79-4920-9b95-1be3a3dbc622","Type":"ContainerStarted","Data":"acaf370b870a4b285e190702c56cb7fe414120cee3dcda8f7c044d3fb898ef8c"} Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.423631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" event={"ID":"fdb3c987-9d79-4920-9b95-1be3a3dbc622","Type":"ContainerStarted","Data":"2f6d7ff73056a49c578ad2ea679af9b957c8d0752fc0c19386efcea28ba41655"} Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.423810 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.456584 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" podStartSLOduration=34.552484735 podStartE2EDuration="36.456565591s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:43.417380148 +0000 UTC m=+968.714195809" lastFinishedPulling="2025-12-05 20:21:45.321461004 +0000 UTC m=+970.618276665" observedRunningTime="2025-12-05 20:21:46.448851067 +0000 UTC m=+971.745666728" watchObservedRunningTime="2025-12-05 20:21:46.456565591 +0000 UTC m=+971.753381252" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.631322 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.631406 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.631466 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.632348 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:21:46 crc kubenswrapper[4885]: I1205 20:21:46.632451 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f" 
gracePeriod=600 Dec 05 20:21:47 crc kubenswrapper[4885]: I1205 20:21:47.434763 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f" exitCode=0 Dec 05 20:21:47 crc kubenswrapper[4885]: I1205 20:21:47.434808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f"} Dec 05 20:21:47 crc kubenswrapper[4885]: I1205 20:21:47.435376 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c"} Dec 05 20:21:47 crc kubenswrapper[4885]: I1205 20:21:47.435411 4885 scope.go:117] "RemoveContainer" containerID="3a54d873f48017e0ab1882609207d2134ae0f9e98ed286e2389ccf25d46ab55d" Dec 05 20:21:48 crc kubenswrapper[4885]: I1205 20:21:48.838832 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:48 crc kubenswrapper[4885]: I1205 20:21:48.839253 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:48 crc kubenswrapper[4885]: I1205 20:21:48.919307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:49 crc kubenswrapper[4885]: I1205 20:21:49.499988 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:49 crc kubenswrapper[4885]: I1205 20:21:49.553489 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:50 crc kubenswrapper[4885]: I1205 20:21:50.542912 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-r6ljq" Dec 05 20:21:50 crc kubenswrapper[4885]: I1205 20:21:50.607673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-z4wtk" Dec 05 20:21:50 crc kubenswrapper[4885]: I1205 20:21:50.998663 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t4mch" Dec 05 20:21:51 crc kubenswrapper[4885]: I1205 20:21:51.141785 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4q2vd" Dec 05 20:21:51 crc kubenswrapper[4885]: I1205 20:21:51.267617 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rqs2p" Dec 05 20:21:51 crc kubenswrapper[4885]: I1205 20:21:51.382353 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nrtkv" Dec 05 20:21:51 crc kubenswrapper[4885]: I1205 20:21:51.464953 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cg9xw" 
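
Two kills with very different grace periods sit side by side here: the machine-config-daemon, restarted in place after its liveness probe failed with gracePeriod=600, and the redhat-marketplace registry-server, killed with gracePeriod=2 because its pod was deleted. The contract is the same in both cases: SIGTERM, wait up to the grace period, then SIGKILL. A process-level sketch of that contract; the real signalling is done by CRI-O, not by kubelet Go code like this:

    package main

    import (
    	"fmt"
    	"os"
    	"os/exec"
    	"syscall"
    	"time"
    )

    // killWithGrace sends SIGTERM, waits up to grace for the process
    // to exit on its own, and escalates to SIGKILL if it does not.
    func killWithGrace(p *os.Process, grace time.Duration) error {
    	if err := p.Signal(syscall.SIGTERM); err != nil {
    		return err
    	}
    	done := make(chan error, 1)
    	go func() { _, err := p.Wait(); done <- err }()
    	select {
    	case err := <-done:
    		return err // exited within the grace period
    	case <-time.After(grace):
    		return p.Kill() // grace expired: SIGKILL
    	}
    }

    func main() {
    	cmd := exec.Command("sleep", "30")
    	if err := cmd.Start(); err != nil {
    		panic(err)
    	}
    	// sleep exits on SIGTERM well within the one-second grace here.
    	fmt.Println(killWithGrace(cmd.Process, time.Second))
    }
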
podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="registry-server" containerID="cri-o://f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006" gracePeriod=2 Dec 05 20:21:51 crc kubenswrapper[4885]: I1205 20:21:51.848633 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.048037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqtmr\" (UniqueName: \"kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr\") pod \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.048107 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content\") pod \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.048133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities\") pod \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\" (UID: \"6f305d3b-1f60-41d0-9d30-ffd33e6a612a\") " Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.049364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities" (OuterVolumeSpecName: "utilities") pod "6f305d3b-1f60-41d0-9d30-ffd33e6a612a" (UID: "6f305d3b-1f60-41d0-9d30-ffd33e6a612a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.056562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr" (OuterVolumeSpecName: "kube-api-access-fqtmr") pod "6f305d3b-1f60-41d0-9d30-ffd33e6a612a" (UID: "6f305d3b-1f60-41d0-9d30-ffd33e6a612a"). InnerVolumeSpecName "kube-api-access-fqtmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.074678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f305d3b-1f60-41d0-9d30-ffd33e6a612a" (UID: "6f305d3b-1f60-41d0-9d30-ffd33e6a612a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.149775 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqtmr\" (UniqueName: \"kubernetes.io/projected/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-kube-api-access-fqtmr\") on node \"crc\" DevicePath \"\"" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.150162 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.150183 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f305d3b-1f60-41d0-9d30-ffd33e6a612a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.488363 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerID="f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006" exitCode=0 Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.488675 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg9xw" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.488678 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerDied","Data":"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006"} Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.488817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg9xw" event={"ID":"6f305d3b-1f60-41d0-9d30-ffd33e6a612a","Type":"ContainerDied","Data":"bcdfbde070bf905231d1559ffada506fac4ec098e04648b1174ebad332c7dca8"} Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.488858 4885 scope.go:117] "RemoveContainer" containerID="f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.522740 4885 scope.go:117] "RemoveContainer" containerID="f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.536405 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.542744 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg9xw"] Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.559445 4885 scope.go:117] "RemoveContainer" containerID="5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.573193 4885 scope.go:117] "RemoveContainer" containerID="f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006" Dec 05 20:21:52 crc kubenswrapper[4885]: E1205 20:21:52.573577 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006\": container with ID starting with f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006 not found: ID does not exist" containerID="f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.573602 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006"} err="failed to get container status \"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006\": rpc error: code = NotFound desc = could not find container \"f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006\": container with ID starting with f4c7f94d58cb282863fca37e308bcc30322b42da7ce9b54a7232ef1873a1f006 not found: ID does not exist" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.573623 4885 scope.go:117] "RemoveContainer" containerID="f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34" Dec 05 20:21:52 crc kubenswrapper[4885]: E1205 20:21:52.573903 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34\": container with ID starting with f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34 not found: ID does not exist" containerID="f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.573924 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34"} err="failed to get container status \"f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34\": rpc error: code = NotFound desc = could not find container \"f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34\": container with ID starting with f271f51414ffe8d9c0202095a14e944e8e37ed4ca088293034210d3862274f34 not found: ID does not exist" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.573935 4885 scope.go:117] "RemoveContainer" containerID="5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d" Dec 05 20:21:52 crc kubenswrapper[4885]: E1205 20:21:52.574186 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d\": container with ID starting with 5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d not found: ID does not exist" containerID="5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.574210 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d"} err="failed to get container status \"5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d\": rpc error: code = NotFound desc = could not find container \"5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d\": container with ID starting with 5194a983df22965f14c238d2ef4abadffbe3166cbfd31262587392923871399d not found: ID does not exist" Dec 05 20:21:52 crc kubenswrapper[4885]: I1205 20:21:52.992235 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5qfqlm" Dec 05 20:21:53 crc kubenswrapper[4885]: I1205 20:21:53.186336 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" path="/var/lib/kubelet/pods/6f305d3b-1f60-41d0-9d30-ffd33e6a612a/volumes" Dec 05 20:21:53 crc kubenswrapper[4885]: I1205 20:21:53.205454 4885 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b47j2" Dec 05 20:21:53 crc kubenswrapper[4885]: I1205 20:21:53.342950 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:54 crc kubenswrapper[4885]: I1205 20:21:54.764126 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:21:54 crc kubenswrapper[4885]: I1205 20:21:54.764761 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pqjfv" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="registry-server" containerID="cri-o://565114308b5870d3b9b58cdf71ff0b37f6e72c683100a21462ad211b93e387f1" gracePeriod=2 Dec 05 20:21:55 crc kubenswrapper[4885]: I1205 20:21:55.514409 4885 generic.go:334] "Generic (PLEG): container finished" podID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerID="565114308b5870d3b9b58cdf71ff0b37f6e72c683100a21462ad211b93e387f1" exitCode=0 Dec 05 20:21:55 crc kubenswrapper[4885]: I1205 20:21:55.514423 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerDied","Data":"565114308b5870d3b9b58cdf71ff0b37f6e72c683100a21462ad211b93e387f1"} Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.718716 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.825245 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrtq\" (UniqueName: \"kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq\") pod \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.825416 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content\") pod \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.825446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities\") pod \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\" (UID: \"3ccb9d8c-dba1-494f-9d52-1b607aba77f2\") " Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.826646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities" (OuterVolumeSpecName: "utilities") pod "3ccb9d8c-dba1-494f-9d52-1b607aba77f2" (UID: "3ccb9d8c-dba1-494f-9d52-1b607aba77f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.832311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq" (OuterVolumeSpecName: "kube-api-access-knrtq") pod "3ccb9d8c-dba1-494f-9d52-1b607aba77f2" (UID: "3ccb9d8c-dba1-494f-9d52-1b607aba77f2"). 
InnerVolumeSpecName "kube-api-access-knrtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.883948 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ccb9d8c-dba1-494f-9d52-1b607aba77f2" (UID: "3ccb9d8c-dba1-494f-9d52-1b607aba77f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.926769 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.926801 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:21:59 crc kubenswrapper[4885]: I1205 20:21:59.926815 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrtq\" (UniqueName: \"kubernetes.io/projected/3ccb9d8c-dba1-494f-9d52-1b607aba77f2-kube-api-access-knrtq\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.554744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" event={"ID":"18cedf03-5e88-4513-b2cc-e364e749f219","Type":"ContainerStarted","Data":"c50610a72bc2d5e041d6da9bffcf12c1fc3a99c9e7125bef40ef20b8a166799d"} Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.557971 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqjfv" event={"ID":"3ccb9d8c-dba1-494f-9d52-1b607aba77f2","Type":"ContainerDied","Data":"896465794d6cb28cee59dcb53af6a1d982ea5ed769342706adae912006507e87"} Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.558065 4885 scope.go:117] "RemoveContainer" containerID="565114308b5870d3b9b58cdf71ff0b37f6e72c683100a21462ad211b93e387f1" Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.558063 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqjfv" Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.570520 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qpp7t" podStartSLOduration=2.518600155 podStartE2EDuration="50.570501081s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:12.152910227 +0000 UTC m=+937.449725888" lastFinishedPulling="2025-12-05 20:22:00.204811153 +0000 UTC m=+985.501626814" observedRunningTime="2025-12-05 20:22:00.567304891 +0000 UTC m=+985.864120572" watchObservedRunningTime="2025-12-05 20:22:00.570501081 +0000 UTC m=+985.867316742" Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.597881 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.603935 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pqjfv"] Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.820324 4885 scope.go:117] "RemoveContainer" containerID="31c3717e9dc2ef32edbd7abc4ef84f8dc7ce7a03c4d9d7f511aa57bd1207ff0f" Dec 05 20:22:00 crc kubenswrapper[4885]: I1205 20:22:00.863142 4885 scope.go:117] "RemoveContainer" containerID="faaa14a4276b53835a663c15cdae1677cb8befba539772d4ecdf13baa2f599c1" Dec 05 20:22:01 crc kubenswrapper[4885]: I1205 20:22:01.183825 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" path="/var/lib/kubelet/pods/3ccb9d8c-dba1-494f-9d52-1b607aba77f2/volumes" Dec 05 20:22:01 crc kubenswrapper[4885]: I1205 20:22:01.568057 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" event={"ID":"49b39782-af0e-4f86-89f4-96582b6a8336","Type":"ContainerStarted","Data":"dbac2d59e69f0967111f5e0354be61d9a04540c6719b543d9b41971f694e3cf7"} Dec 05 20:22:01 crc kubenswrapper[4885]: I1205 20:22:01.568639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:22:01 crc kubenswrapper[4885]: I1205 20:22:01.595259 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" podStartSLOduration=2.814520666 podStartE2EDuration="51.595235206s" podCreationTimestamp="2025-12-05 20:21:10 +0000 UTC" firstStartedPulling="2025-12-05 20:21:12.084200223 +0000 UTC m=+937.381015884" lastFinishedPulling="2025-12-05 20:22:00.864914753 +0000 UTC m=+986.161730424" observedRunningTime="2025-12-05 20:22:01.586521484 +0000 UTC m=+986.883337145" watchObservedRunningTime="2025-12-05 20:22:01.595235206 +0000 UTC m=+986.892050877" Dec 05 20:22:11 crc kubenswrapper[4885]: I1205 20:22:11.281773 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-565xh" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.475892 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.476777 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="extract-utilities" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.476793 4885 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="extract-utilities" Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.476809 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="extract-content" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.476817 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="extract-content" Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.476841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.476850 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.476873 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.476883 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.484115 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="extract-utilities" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.484132 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="extract-utilities" Dec 05 20:22:28 crc kubenswrapper[4885]: E1205 20:22:28.484156 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="extract-content" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.484167 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="extract-content" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.484445 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ccb9d8c-dba1-494f-9d52-1b607aba77f2" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.484464 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f305d3b-1f60-41d0-9d30-ffd33e6a612a" containerName="registry-server" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.485323 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.491356 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.496663 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.497092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pcp9k" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.497294 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.497509 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.556562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.556701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjmt\" (UniqueName: \"kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.561806 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.563145 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.569652 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.572796 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.657731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjmt\" (UniqueName: \"kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.657793 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.657832 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.657860 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4dt\" (UniqueName: \"kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.657913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.659063 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.678890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjmt\" (UniqueName: \"kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt\") pod \"dnsmasq-dns-5cd484bb89-z4279\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.758987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 
20:22:28.759095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.759882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.759943 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.759988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4dt\" (UniqueName: \"kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.781890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq4dt\" (UniqueName: \"kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt\") pod \"dnsmasq-dns-567c455747-8m8g9\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.835473 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:28 crc kubenswrapper[4885]: I1205 20:22:28.881208 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:29 crc kubenswrapper[4885]: I1205 20:22:29.297971 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:29 crc kubenswrapper[4885]: I1205 20:22:29.306630 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:29 crc kubenswrapper[4885]: W1205 20:22:29.309281 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13095c2a_f954_4e11_8f7d_815a25686e25.slice/crio-c388a0b720bfb2165cdec7d5dcab9dcc24884aa09e7e070ab1ef42c5316d4a89 WatchSource:0}: Error finding container c388a0b720bfb2165cdec7d5dcab9dcc24884aa09e7e070ab1ef42c5316d4a89: Status 404 returned error can't find the container with id c388a0b720bfb2165cdec7d5dcab9dcc24884aa09e7e070ab1ef42c5316d4a89 Dec 05 20:22:29 crc kubenswrapper[4885]: W1205 20:22:29.309615 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcffaecab_046b_446e_b28f_555ce23a48ae.slice/crio-6b74b1235c660955fa0f74633d45cf81a6f325105dd0272e479a4ffca9586ab6 WatchSource:0}: Error finding container 6b74b1235c660955fa0f74633d45cf81a6f325105dd0272e479a4ffca9586ab6: Status 404 returned error can't find the container with id 6b74b1235c660955fa0f74633d45cf81a6f325105dd0272e479a4ffca9586ab6 Dec 05 20:22:29 crc kubenswrapper[4885]: I1205 20:22:29.312271 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:22:29 crc kubenswrapper[4885]: I1205 20:22:29.809446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" event={"ID":"cffaecab-046b-446e-b28f-555ce23a48ae","Type":"ContainerStarted","Data":"6b74b1235c660955fa0f74633d45cf81a6f325105dd0272e479a4ffca9586ab6"} Dec 05 20:22:29 crc kubenswrapper[4885]: I1205 20:22:29.811870 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-8m8g9" event={"ID":"13095c2a-f954-4e11-8f7d-815a25686e25","Type":"ContainerStarted","Data":"c388a0b720bfb2165cdec7d5dcab9dcc24884aa09e7e070ab1ef42c5316d4a89"} Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.588084 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.620915 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.623511 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.656207 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.701941 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcr8\" (UniqueName: \"kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.702000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.702036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.802876 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcr8\" (UniqueName: \"kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.802944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.802966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.803836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.803979 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.837594 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcr8\" (UniqueName: 
\"kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8\") pod \"dnsmasq-dns-bc4b48fc9-whjvl\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.910931 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.932013 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.933199 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.946158 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:22:31 crc kubenswrapper[4885]: I1205 20:22:31.949495 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.006550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.006703 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.006731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tzxj\" (UniqueName: \"kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.109914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.109960 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tzxj\" (UniqueName: \"kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.109984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.111076 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.111666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.160497 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tzxj\" (UniqueName: \"kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj\") pod \"dnsmasq-dns-cb666b895-sg9cr\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.252447 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.487659 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:32 crc kubenswrapper[4885]: I1205 20:22:32.687540 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.241160 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.242982 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.246198 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.247306 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.247791 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.248012 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.248163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mkf59" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.248299 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.248313 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.249996 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.250810 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.252154 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.254616 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.255131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.255444 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.258542 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.260153 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-w8rlv" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.261625 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.272853 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.280958 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432248 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432349 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: 
I1205 20:22:33.432369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432396 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432481 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432496 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432528 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432564 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qsbc\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432602 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432632 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fczd7\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.432653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.533963 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534048 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534165 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534337 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " 
pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534463 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qsbc\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534544 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534565 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fczd7\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534600 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534643 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.534786 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.535049 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.535979 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.536741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.537496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.537831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.538301 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.538168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " 
pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.538584 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.539467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.540060 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.540631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.541179 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.541539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.541648 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.541839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.548182 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.548539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.550555 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.554225 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fczd7\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.560198 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qsbc\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.560272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.573370 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.575348 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.586579 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:22:33 crc kubenswrapper[4885]: I1205 20:22:33.875916 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.533466 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.535210 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.539459 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.539696 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.540163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.540250 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r949x" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.543516 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.553567 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.653988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwj64\" (UniqueName: \"kubernetes.io/projected/3e1a8619-8184-43c1-9444-8e86fbc4213d-kube-api-access-hwj64\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654410 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-kolla-config\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654430 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654450 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.654528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwj64\" (UniqueName: \"kubernetes.io/projected/3e1a8619-8184-43c1-9444-8e86fbc4213d-kube-api-access-hwj64\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-kolla-config\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756276 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.756321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.757359 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.757775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-kolla-config\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.758511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-config-data-default\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.758666 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.758906 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1a8619-8184-43c1-9444-8e86fbc4213d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.761644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.768749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1a8619-8184-43c1-9444-8e86fbc4213d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.773056 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwj64\" (UniqueName: \"kubernetes.io/projected/3e1a8619-8184-43c1-9444-8e86fbc4213d-kube-api-access-hwj64\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.781673 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3e1a8619-8184-43c1-9444-8e86fbc4213d\") " pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.863731 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 20:22:34 crc kubenswrapper[4885]: I1205 20:22:34.892172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" event={"ID":"2522c28b-c324-408d-a8b7-7e3d83709a6a","Type":"ContainerStarted","Data":"96f85ac3ec9d703566c091de6c7c6f6d574049ceb4f57a346b377e201a43cc22"} Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.903643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" event={"ID":"9e19c4d9-3055-4b50-b37d-f02aab457b39","Type":"ContainerStarted","Data":"b8866bb97e88fac93b2fc2ac16af47c9a6b400c81066e89aa4f0d01ff01f8b93"} Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.924494 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.930340 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.933358 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-j7k9l" Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.934198 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.934388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.934596 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 20:22:35 crc kubenswrapper[4885]: I1205 20:22:35.941385 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075341 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075424 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075448 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hfz\" (UniqueName: \"kubernetes.io/projected/93184776-73bf-4ff3-9f7f-66b46fd511ed-kube-api-access-p4hfz\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.075515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.177442 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.177698 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.177789 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.177911 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178233 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hfz\" (UniqueName: \"kubernetes.io/projected/93184776-73bf-4ff3-9f7f-66b46fd511ed-kube-api-access-p4hfz\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178361 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178492 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178608 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178803 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.178895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.179630 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.179730 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93184776-73bf-4ff3-9f7f-66b46fd511ed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.183752 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.194594 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93184776-73bf-4ff3-9f7f-66b46fd511ed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.197101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hfz\" (UniqueName: \"kubernetes.io/projected/93184776-73bf-4ff3-9f7f-66b46fd511ed-kube-api-access-p4hfz\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.199496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"93184776-73bf-4ff3-9f7f-66b46fd511ed\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.252584 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.448734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.452999 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.455632 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.455721 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.456117 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-k5kq8" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.457066 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.583477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-combined-ca-bundle\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.583539 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-kolla-config\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.583571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-config-data\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.583608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvzj\" (UniqueName: \"kubernetes.io/projected/12c40607-2770-4b97-95f1-6ac26280d337-kube-api-access-trvzj\") pod \"memcached-0\" (UID: 
\"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.583686 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.684800 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-config-data\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.684872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trvzj\" (UniqueName: \"kubernetes.io/projected/12c40607-2770-4b97-95f1-6ac26280d337-kube-api-access-trvzj\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.684950 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.685006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-combined-ca-bundle\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.685056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-kolla-config\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.685722 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-kolla-config\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.685722 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12c40607-2770-4b97-95f1-6ac26280d337-config-data\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.688182 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.690608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c40607-2770-4b97-95f1-6ac26280d337-combined-ca-bundle\") pod 
\"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.706711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvzj\" (UniqueName: \"kubernetes.io/projected/12c40607-2770-4b97-95f1-6ac26280d337-kube-api-access-trvzj\") pod \"memcached-0\" (UID: \"12c40607-2770-4b97-95f1-6ac26280d337\") " pod="openstack/memcached-0" Dec 05 20:22:36 crc kubenswrapper[4885]: I1205 20:22:36.816350 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.292055 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.293345 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.295543 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2phxg" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.298345 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.411869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khp4x\" (UniqueName: \"kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x\") pod \"kube-state-metrics-0\" (UID: \"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0\") " pod="openstack/kube-state-metrics-0" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.514358 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khp4x\" (UniqueName: \"kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x\") pod \"kube-state-metrics-0\" (UID: \"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0\") " pod="openstack/kube-state-metrics-0" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.530604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khp4x\" (UniqueName: \"kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x\") pod \"kube-state-metrics-0\" (UID: \"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0\") " pod="openstack/kube-state-metrics-0" Dec 05 20:22:38 crc kubenswrapper[4885]: I1205 20:22:38.609702 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.099035 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ptwvl"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.101115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.103839 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ptwvl"] Dec 05 20:22:43 crc kubenswrapper[4885]: E1205 20:22:43.110975 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 20:22:43 crc kubenswrapper[4885]: E1205 20:22:43.111140 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vq4dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-8m8g9_openstack(13095c2a-f954-4e11-8f7d-815a25686e25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.111747 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-27pvh" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.112002 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 20:22:43 crc kubenswrapper[4885]: E1205 20:22:43.113370 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-8m8g9" podUID="13095c2a-f954-4e11-8f7d-815a25686e25" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.140227 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.158483 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hgth4"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.162138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.166644 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hgth4"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-combined-ca-bundle\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-etc-ovs\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c5b9a2-f65e-4223-ac3f-f49a4e160454-scripts\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-ovn-controller-tls-certs\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqdr7\" (UniqueName: \"kubernetes.io/projected/32c5b9a2-f65e-4223-ac3f-f49a4e160454-kube-api-access-gqdr7\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300597 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-log\") pod \"ovn-controller-ovs-hgth4\" (UID: 
\"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-lib\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300662 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsclb\" (UniqueName: \"kubernetes.io/projected/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-kube-api-access-hsclb\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300682 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-scripts\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-log-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.300736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-run\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.301374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-lib\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsclb\" (UniqueName: \"kubernetes.io/projected/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-kube-api-access-hsclb\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402882 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-scripts\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc 
kubenswrapper[4885]: I1205 20:22:43.402908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-log-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402929 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-run\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.402984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-combined-ca-bundle\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403011 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-etc-ovs\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c5b9a2-f65e-4223-ac3f-f49a4e160454-scripts\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403070 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-ovn-controller-tls-certs\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403101 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqdr7\" (UniqueName: \"kubernetes.io/projected/32c5b9a2-f65e-4223-ac3f-f49a4e160454-kube-api-access-gqdr7\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403159 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-log\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.403684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-lib\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.404239 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-log\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.404468 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-log-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.405107 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-var-run\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.405211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run-ovn\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.406288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-var-run\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.407523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/32c5b9a2-f65e-4223-ac3f-f49a4e160454-etc-ovs\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.407548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-scripts\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.407744 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c5b9a2-f65e-4223-ac3f-f49a4e160454-scripts\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.411996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-ovn-controller-tls-certs\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.424770 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqdr7\" (UniqueName: \"kubernetes.io/projected/32c5b9a2-f65e-4223-ac3f-f49a4e160454-kube-api-access-gqdr7\") pod \"ovn-controller-ovs-hgth4\" (UID: \"32c5b9a2-f65e-4223-ac3f-f49a4e160454\") " pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.425876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsclb\" (UniqueName: \"kubernetes.io/projected/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-kube-api-access-hsclb\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.430729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99-combined-ca-bundle\") pod \"ovn-controller-ptwvl\" (UID: \"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99\") " pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.514412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.571142 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.582477 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:43 crc kubenswrapper[4885]: W1205 20:22:43.644776 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e1a8619_8184_43c1_9444_8e86fbc4213d.slice/crio-daf15f07de710da42eb677903460f341bb0b99b57aa98e0e26af1c8035661cbe WatchSource:0}: Error finding container daf15f07de710da42eb677903460f341bb0b99b57aa98e0e26af1c8035661cbe: Status 404 returned error can't find the container with id daf15f07de710da42eb677903460f341bb0b99b57aa98e0e26af1c8035661cbe Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.657419 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: W1205 20:22:43.659752 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9ea268e_bdd3_4ff6_b04c_f15e8bc98d68.slice/crio-b833772c175681c8e67a2eb332f9c311bb97f55f73a4fd4d359ffe79e15279d5 WatchSource:0}: Error finding container b833772c175681c8e67a2eb332f9c311bb97f55f73a4fd4d359ffe79e15279d5: Status 404 returned error can't find the container with id b833772c175681c8e67a2eb332f9c311bb97f55f73a4fd4d359ffe79e15279d5 Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.667490 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.675874 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: W1205 20:22:43.684395 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a65b714_cb9c_4ce6_a5eb_5ebe8a7b2bee.slice/crio-cd892dc43a46ad8270548b30d8e8d28ac09267d52acf697d8b254d0c18e27f61 WatchSource:0}: Error finding container cd892dc43a46ad8270548b30d8e8d28ac09267d52acf697d8b254d0c18e27f61: Status 404 returned error can't find the container with id cd892dc43a46ad8270548b30d8e8d28ac09267d52acf697d8b254d0c18e27f61 Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.805073 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.815884 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: W1205 20:22:43.819437 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ff7d19_e58f_467f_aaed_fc34e25e6dc0.slice/crio-07a6348626dc6d688e414928c3b7eee660610dbbc8f06731c679d93f8b8a003b WatchSource:0}: Error finding container 07a6348626dc6d688e414928c3b7eee660610dbbc8f06731c679d93f8b8a003b: Status 404 returned error can't find the container with id 07a6348626dc6d688e414928c3b7eee660610dbbc8f06731c679d93f8b8a003b Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.963396 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.965299 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.980663 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.980704 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.982236 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.982261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.982820 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jfqn8" Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.986942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.993634 4885 generic.go:334] "Generic (PLEG): container finished" podID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerID="38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760" exitCode=0 Dec 05 20:22:43 crc kubenswrapper[4885]: I1205 20:22:43.993691 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" event={"ID":"9e19c4d9-3055-4b50-b37d-f02aab457b39","Type":"ContainerDied","Data":"38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.009788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0","Type":"ContainerStarted","Data":"07a6348626dc6d688e414928c3b7eee660610dbbc8f06731c679d93f8b8a003b"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.038308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3e1a8619-8184-43c1-9444-8e86fbc4213d","Type":"ContainerStarted","Data":"daf15f07de710da42eb677903460f341bb0b99b57aa98e0e26af1c8035661cbe"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.044801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93184776-73bf-4ff3-9f7f-66b46fd511ed","Type":"ContainerStarted","Data":"2a88435649033eb2b9270e001bb6040da6a60f5eaddceb7208d3985aae0e7b39"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.052313 4885 generic.go:334] "Generic (PLEG): container finished" podID="cffaecab-046b-446e-b28f-555ce23a48ae" containerID="60b107f75640c5fa916c2122af7211bb2bd53e8adf60e09338fbbad4adf3433d" exitCode=0 Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.052684 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" event={"ID":"cffaecab-046b-446e-b28f-555ce23a48ae","Type":"ContainerDied","Data":"60b107f75640c5fa916c2122af7211bb2bd53e8adf60e09338fbbad4adf3433d"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.062107 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ptwvl"] Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.087411 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerStarted","Data":"b833772c175681c8e67a2eb332f9c311bb97f55f73a4fd4d359ffe79e15279d5"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.103186 4885 generic.go:334] "Generic (PLEG): container finished" podID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerID="c0e0b6a7d108f7e0afb543795b1f8f317ad869723d8c3fbb2e4dd8d6a98eab0f" exitCode=0 Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.103279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" event={"ID":"2522c28b-c324-408d-a8b7-7e3d83709a6a","Type":"ContainerDied","Data":"c0e0b6a7d108f7e0afb543795b1f8f317ad869723d8c3fbb2e4dd8d6a98eab0f"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.115438 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerStarted","Data":"cd892dc43a46ad8270548b30d8e8d28ac09267d52acf697d8b254d0c18e27f61"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.120777 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.120875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.120907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.120988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2d6p\" (UniqueName: \"kubernetes.io/projected/7edaf8ab-283b-46bc-89e2-a3c8f681624b-kube-api-access-x2d6p\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.121036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.121068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.121115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.121146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.140464 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"12c40607-2770-4b97-95f1-6ac26280d337","Type":"ContainerStarted","Data":"6f84301493cc7c7fc52392a18979bd32cded9d2944bb2e0f300d0ea8a18a96bc"} Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.225396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2d6p\" (UniqueName: \"kubernetes.io/projected/7edaf8ab-283b-46bc-89e2-a3c8f681624b-kube-api-access-x2d6p\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.225964 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.226126 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.226287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.226413 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.226536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.227458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.241228 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.241288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.244331 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.244759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.245031 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edaf8ab-283b-46bc-89e2-a3c8f681624b-config\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.245122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.248435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.250336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edaf8ab-283b-46bc-89e2-a3c8f681624b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.264340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2d6p\" (UniqueName: \"kubernetes.io/projected/7edaf8ab-283b-46bc-89e2-a3c8f681624b-kube-api-access-x2d6p\") pod \"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.306662 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hgth4"] Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.317314 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"7edaf8ab-283b-46bc-89e2-a3c8f681624b\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: E1205 20:22:44.483874 4885 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 05 20:22:44 crc kubenswrapper[4885]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/9e19c4d9-3055-4b50-b37d-f02aab457b39/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 20:22:44 crc kubenswrapper[4885]: > podSandboxID="b8866bb97e88fac93b2fc2ac16af47c9a6b400c81066e89aa4f0d01ff01f8b93" Dec 05 20:22:44 crc kubenswrapper[4885]: E1205 20:22:44.484028 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 05 20:22:44 crc kubenswrapper[4885]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjcr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc4b48fc9-whjvl_openstack(9e19c4d9-3055-4b50-b37d-f02aab457b39): CreateContainerError: container create 
failed: mount `/var/lib/kubelet/pods/9e19c4d9-3055-4b50-b37d-f02aab457b39/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 20:22:44 crc kubenswrapper[4885]: > logger="UnhandledError" Dec 05 20:22:44 crc kubenswrapper[4885]: E1205 20:22:44.488616 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/9e19c4d9-3055-4b50-b37d-f02aab457b39/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.518186 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-z7wfg"] Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.519360 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.522978 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.529263 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-z7wfg"] Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551335 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-combined-ca-bundle\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28451893-15ed-4dc1-a6ef-f93fed27316e-config\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551656 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovs-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbfdw\" (UniqueName: \"kubernetes.io/projected/28451893-15ed-4dc1-a6ef-f93fed27316e-kube-api-access-gbfdw\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg"
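The CreateContainerError above comes from the dns-svc subPath mount of pod dnsmasq-dns-bc4b48fc9-whjvl. For a subPath mount the kubelet prepares a bind source under /var/lib/kubelet/pods/<uid>/volume-subpaths/<volume>/<container>/<index>, and the runtime then mounts that source into the container; the target in the error reads as the rootfs-relative `etc/dnsmasq.d/hosts/dns-svc` because it is resolved inside the container root. The sketch below restates the failing mount exactly as it appears in the &Container dump, with core/v1 types; the reading of the trailing "/1" as the mount's index in VolumeMounts is my interpretation (it does match the dump's ordering: config=0, dns-svc=1, kube-api-access-sjcr8=2).

```go
// Sketch only: the dns-svc VolumeMount from the &Container dump above.
package dnsexample

import corev1 "k8s.io/api/core/v1"

// dnsSvcMount is the mount whose kubelet-prepared bind source
// (/var/lib/kubelet/pods/<uid>/volume-subpaths/dns-svc/dnsmasq-dns/1)
// was missing when CRI-O tried to mount it into the container rootfs.
var dnsSvcMount = corev1.VolumeMount{
	Name:      "dns-svc",
	ReadOnly:  true,
	MountPath: "/etc/dnsmasq.d/hosts/dns-svc",
	SubPath:   "dns-svc",
}
```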
Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.551763 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovn-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.602754 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbfdw\" (UniqueName: \"kubernetes.io/projected/28451893-15ed-4dc1-a6ef-f93fed27316e-kube-api-access-gbfdw\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovn-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-combined-ca-bundle\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28451893-15ed-4dc1-a6ef-f93fed27316e-config\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.655626 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovs-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.656200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovs-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.657171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/28451893-15ed-4dc1-a6ef-f93fed27316e-ovn-rundir\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " 
pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.657400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28451893-15ed-4dc1-a6ef-f93fed27316e-config\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.660658 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.663259 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28451893-15ed-4dc1-a6ef-f93fed27316e-combined-ca-bundle\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.677538 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbfdw\" (UniqueName: \"kubernetes.io/projected/28451893-15ed-4dc1-a6ef-f93fed27316e-kube-api-access-gbfdw\") pod \"ovn-controller-metrics-z7wfg\" (UID: \"28451893-15ed-4dc1-a6ef-f93fed27316e\") " pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.745674 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.746780 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757057 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc\") pod \"13095c2a-f954-4e11-8f7d-815a25686e25\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config\") pod \"13095c2a-f954-4e11-8f7d-815a25686e25\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757352 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq4dt\" (UniqueName: \"kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt\") pod \"13095c2a-f954-4e11-8f7d-815a25686e25\" (UID: \"13095c2a-f954-4e11-8f7d-815a25686e25\") " Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757403 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config\") pod \"cffaecab-046b-446e-b28f-555ce23a48ae\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjmt\" (UniqueName: \"kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt\") pod \"cffaecab-046b-446e-b28f-555ce23a48ae\" (UID: \"cffaecab-046b-446e-b28f-555ce23a48ae\") " Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.757934 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config" (OuterVolumeSpecName: "config") pod "13095c2a-f954-4e11-8f7d-815a25686e25" (UID: "13095c2a-f954-4e11-8f7d-815a25686e25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.758535 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.762830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13095c2a-f954-4e11-8f7d-815a25686e25" (UID: "13095c2a-f954-4e11-8f7d-815a25686e25"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.763065 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt" (OuterVolumeSpecName: "kube-api-access-7xjmt") pod "cffaecab-046b-446e-b28f-555ce23a48ae" (UID: "cffaecab-046b-446e-b28f-555ce23a48ae"). InnerVolumeSpecName "kube-api-access-7xjmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.765222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt" (OuterVolumeSpecName: "kube-api-access-vq4dt") pod "13095c2a-f954-4e11-8f7d-815a25686e25" (UID: "13095c2a-f954-4e11-8f7d-815a25686e25"). InnerVolumeSpecName "kube-api-access-vq4dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.810332 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config" (OuterVolumeSpecName: "config") pod "cffaecab-046b-446e-b28f-555ce23a48ae" (UID: "cffaecab-046b-446e-b28f-555ce23a48ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.843699 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-z7wfg" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.859722 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq4dt\" (UniqueName: \"kubernetes.io/projected/13095c2a-f954-4e11-8f7d-815a25686e25-kube-api-access-vq4dt\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.859763 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cffaecab-046b-446e-b28f-555ce23a48ae-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.859780 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjmt\" (UniqueName: \"kubernetes.io/projected/cffaecab-046b-446e-b28f-555ce23a48ae-kube-api-access-7xjmt\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:44 crc kubenswrapper[4885]: I1205 20:22:44.859811 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13095c2a-f954-4e11-8f7d-815a25686e25-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.145647 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-z7wfg"] Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.149739 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.150139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-z4279" event={"ID":"cffaecab-046b-446e-b28f-555ce23a48ae","Type":"ContainerDied","Data":"6b74b1235c660955fa0f74633d45cf81a6f325105dd0272e479a4ffca9586ab6"} Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.150219 4885 scope.go:117] "RemoveContainer" containerID="60b107f75640c5fa916c2122af7211bb2bd53e8adf60e09338fbbad4adf3433d" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.157034 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-8m8g9" event={"ID":"13095c2a-f954-4e11-8f7d-815a25686e25","Type":"ContainerDied","Data":"c388a0b720bfb2165cdec7d5dcab9dcc24884aa09e7e070ab1ef42c5316d4a89"} Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.157082 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-8m8g9" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.158288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ptwvl" event={"ID":"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99","Type":"ContainerStarted","Data":"071d1b9f2090be845fb63b8dd70f7d95cd7253a574132a57fb566666be2baae6"} Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.159130 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hgth4" event={"ID":"32c5b9a2-f65e-4223-ac3f-f49a4e160454","Type":"ContainerStarted","Data":"5d625ec02ab0376a1d174dd6b2448315017f0ef890638fffbb1ed1572b303194"} Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.169038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" event={"ID":"2522c28b-c324-408d-a8b7-7e3d83709a6a","Type":"ContainerStarted","Data":"3cf8e1c6e6b8c7fd66373a16e2bfbbe3451a3a140dd01dc3ecaf3f487d76527e"} Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.169663 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.225196 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" podStartSLOduration=6.011016622 podStartE2EDuration="14.225180287s" podCreationTimestamp="2025-12-05 20:22:31 +0000 UTC" firstStartedPulling="2025-12-05 20:22:34.849705251 +0000 UTC m=+1020.146520912" lastFinishedPulling="2025-12-05 20:22:43.063868916 +0000 UTC m=+1028.360684577" observedRunningTime="2025-12-05 20:22:45.22335152 +0000 UTC m=+1030.520167181" watchObservedRunningTime="2025-12-05 20:22:45.225180287 +0000 UTC m=+1030.521995948" Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.442983 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.462478 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-8m8g9"] Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.479084 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.481079 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-z4279"] Dec 05 20:22:45 crc kubenswrapper[4885]: I1205 20:22:45.546829 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:22:45 crc kubenswrapper[4885]: W1205 20:22:45.618118 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7edaf8ab_283b_46bc_89e2_a3c8f681624b.slice/crio-1581f34afe84f242ef485c739b13ebecfdb6998f2619957000d653f97c4a2d29 WatchSource:0}: Error finding container 1581f34afe84f242ef485c739b13ebecfdb6998f2619957000d653f97c4a2d29: Status 404 returned error can't find the container with id 1581f34afe84f242ef485c739b13ebecfdb6998f2619957000d653f97c4a2d29 Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.102393 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:22:46 crc kubenswrapper[4885]: E1205 20:22:46.103034 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffaecab-046b-446e-b28f-555ce23a48ae" containerName="init" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 
20:22:46.103056 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffaecab-046b-446e-b28f-555ce23a48ae" containerName="init" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.103288 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffaecab-046b-446e-b28f-555ce23a48ae" containerName="init" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.104931 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.109598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.110534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8r9kg" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.110668 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.110938 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.112891 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.191540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-z7wfg" event={"ID":"28451893-15ed-4dc1-a6ef-f93fed27316e","Type":"ContainerStarted","Data":"8f6c189e583316f4cb911078a5439436a5396b7b1627420841675fa5070e8ae2"} Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.193282 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7edaf8ab-283b-46bc-89e2-a3c8f681624b","Type":"ContainerStarted","Data":"1581f34afe84f242ef485c739b13ebecfdb6998f2619957000d653f97c4a2d29"} Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.286836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.286905 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.286927 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrw4\" (UniqueName: \"kubernetes.io/projected/91d9cbb0-7966-411b-86e4-b80882da454e-kube-api-access-7vrw4\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.286964 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 
20:22:46.287138 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.287158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.287181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.287209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-config\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.389913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.389970 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.389995 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrw4\" (UniqueName: \"kubernetes.io/projected/91d9cbb0-7966-411b-86e4-b80882da454e-kube-api-access-7vrw4\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390108 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390141 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-config\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.390689 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.391264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-config\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.391426 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d9cbb0-7966-411b-86e4-b80882da454e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.391607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.397560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.397631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.407793 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d9cbb0-7966-411b-86e4-b80882da454e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.408144 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7vrw4\" (UniqueName: \"kubernetes.io/projected/91d9cbb0-7966-411b-86e4-b80882da454e-kube-api-access-7vrw4\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.419254 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91d9cbb0-7966-411b-86e4-b80882da454e\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:46 crc kubenswrapper[4885]: I1205 20:22:46.430221 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:47 crc kubenswrapper[4885]: I1205 20:22:47.182626 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13095c2a-f954-4e11-8f7d-815a25686e25" path="/var/lib/kubelet/pods/13095c2a-f954-4e11-8f7d-815a25686e25/volumes" Dec 05 20:22:47 crc kubenswrapper[4885]: I1205 20:22:47.183456 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffaecab-046b-446e-b28f-555ce23a48ae" path="/var/lib/kubelet/pods/cffaecab-046b-446e-b28f-555ce23a48ae/volumes" Dec 05 20:22:51 crc kubenswrapper[4885]: I1205 20:22:51.685558 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:22:51 crc kubenswrapper[4885]: W1205 20:22:51.980076 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91d9cbb0_7966_411b_86e4_b80882da454e.slice/crio-e7eb0ab0236699f2b438d63aac7f2dccafe156d55863a7acd80efaab2a3a07dd WatchSource:0}: Error finding container e7eb0ab0236699f2b438d63aac7f2dccafe156d55863a7acd80efaab2a3a07dd: Status 404 returned error can't find the container with id e7eb0ab0236699f2b438d63aac7f2dccafe156d55863a7acd80efaab2a3a07dd Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.246298 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" event={"ID":"9e19c4d9-3055-4b50-b37d-f02aab457b39","Type":"ContainerStarted","Data":"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7"} Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.246759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.248175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91d9cbb0-7966-411b-86e4-b80882da454e","Type":"ContainerStarted","Data":"e7eb0ab0236699f2b438d63aac7f2dccafe156d55863a7acd80efaab2a3a07dd"} Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.254228 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.270013 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" podStartSLOduration=13.650499764 podStartE2EDuration="21.269999342s" podCreationTimestamp="2025-12-05 20:22:31 +0000 UTC" firstStartedPulling="2025-12-05 20:22:35.459887322 +0000 UTC m=+1020.756702983" lastFinishedPulling="2025-12-05 20:22:43.0793869 +0000 UTC m=+1028.376202561" observedRunningTime="2025-12-05 20:22:52.26515623 +0000 UTC m=+1037.561971931" watchObservedRunningTime="2025-12-05 20:22:52.269999342 +0000 UTC m=+1037.566815003" Dec 05 20:22:52 crc 
Dec 05 20:22:52 crc kubenswrapper[4885]: I1205 20:22:52.325870 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.265175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hgth4" event={"ID":"32c5b9a2-f65e-4223-ac3f-f49a4e160454","Type":"ContainerStarted","Data":"a3663469c670c736e5269d67152b5ed9953dfe0836aa12e16e52c26348f640fc"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.266723 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0","Type":"ContainerStarted","Data":"c656f3227fd86c984cc32aa4c4551af055f62b5b7fed31bb23e4be876f42b07e"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.267247 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.269172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91d9cbb0-7966-411b-86e4-b80882da454e","Type":"ContainerStarted","Data":"1b1b9f86515abd9442f8673301b3cf9a574ab085ea45711a1a683d54e55e8d87"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.270536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3e1a8619-8184-43c1-9444-8e86fbc4213d","Type":"ContainerStarted","Data":"d66b598e3afa3baf34bce11d842b381f0da16761713f453e92c12706b9728797"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.271928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93184776-73bf-4ff3-9f7f-66b46fd511ed","Type":"ContainerStarted","Data":"d4b55fdb38be1516407bacc8225f4710b58caff0de9e12a6c3a4ba1e38f1a1ce"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.274140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"12c40607-2770-4b97-95f1-6ac26280d337","Type":"ContainerStarted","Data":"d9f1a060c8e6485cc4a808dd0abd4c095c670c1926ab42ba127c717a1fa7235a"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.274279 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.276168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ptwvl" event={"ID":"0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99","Type":"ContainerStarted","Data":"03b59ff9d61c64adcc41534c4902fadac36b5427919bb749667809fa53998c1c"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.276671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ptwvl" Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.279109 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerName="dnsmasq-dns" containerID="cri-o://d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7" gracePeriod=10 Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.279403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7edaf8ab-283b-46bc-89e2-a3c8f681624b","Type":"ContainerStarted","Data":"dc2faa4d873c3705e04caf73ee1a0df85613d1604ad82553d6a930eb01dd27aa"} Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.308834 4885 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ovn-controller-ptwvl" podStartSLOduration=3.792943294 podStartE2EDuration="11.308813867s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:22:44.087524336 +0000 UTC m=+1029.384339997" lastFinishedPulling="2025-12-05 20:22:51.603394909 +0000 UTC m=+1036.900210570" observedRunningTime="2025-12-05 20:22:54.303901184 +0000 UTC m=+1039.600716875" watchObservedRunningTime="2025-12-05 20:22:54.308813867 +0000 UTC m=+1039.605629528" Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.341356 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.530904403 podStartE2EDuration="16.341331305s" podCreationTimestamp="2025-12-05 20:22:38 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.823513054 +0000 UTC m=+1029.120328715" lastFinishedPulling="2025-12-05 20:22:53.633939956 +0000 UTC m=+1038.930755617" observedRunningTime="2025-12-05 20:22:54.334291444 +0000 UTC m=+1039.631107105" watchObservedRunningTime="2025-12-05 20:22:54.341331305 +0000 UTC m=+1039.638146966" Dec 05 20:22:54 crc kubenswrapper[4885]: I1205 20:22:54.404253 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.919318476 podStartE2EDuration="18.404232212s" podCreationTimestamp="2025-12-05 20:22:36 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.681271282 +0000 UTC m=+1028.978086943" lastFinishedPulling="2025-12-05 20:22:51.166185018 +0000 UTC m=+1036.463000679" observedRunningTime="2025-12-05 20:22:54.398694759 +0000 UTC m=+1039.695510420" watchObservedRunningTime="2025-12-05 20:22:54.404232212 +0000 UTC m=+1039.701047873" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.045487 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.234334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjcr8\" (UniqueName: \"kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8\") pod \"9e19c4d9-3055-4b50-b37d-f02aab457b39\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.234386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config\") pod \"9e19c4d9-3055-4b50-b37d-f02aab457b39\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.234438 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc\") pod \"9e19c4d9-3055-4b50-b37d-f02aab457b39\" (UID: \"9e19c4d9-3055-4b50-b37d-f02aab457b39\") " Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.240780 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8" (OuterVolumeSpecName: "kube-api-access-sjcr8") pod "9e19c4d9-3055-4b50-b37d-f02aab457b39" (UID: "9e19c4d9-3055-4b50-b37d-f02aab457b39"). InnerVolumeSpecName "kube-api-access-sjcr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.270104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config" (OuterVolumeSpecName: "config") pod "9e19c4d9-3055-4b50-b37d-f02aab457b39" (UID: "9e19c4d9-3055-4b50-b37d-f02aab457b39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.271517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e19c4d9-3055-4b50-b37d-f02aab457b39" (UID: "9e19c4d9-3055-4b50-b37d-f02aab457b39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.311343 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.311426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" event={"ID":"9e19c4d9-3055-4b50-b37d-f02aab457b39","Type":"ContainerDied","Data":"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.311734 4885 scope.go:117] "RemoveContainer" containerID="d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.312145 4885 generic.go:334] "Generic (PLEG): container finished" podID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerID="d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7" exitCode=0 Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.312229 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whjvl" event={"ID":"9e19c4d9-3055-4b50-b37d-f02aab457b39","Type":"ContainerDied","Data":"b8866bb97e88fac93b2fc2ac16af47c9a6b400c81066e89aa4f0d01ff01f8b93"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.315932 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerStarted","Data":"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.322028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91d9cbb0-7966-411b-86e4-b80882da454e","Type":"ContainerStarted","Data":"d2ff5316978632315c5e07e389d80deea95f8463c27c458a3137b27f0684a6a3"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.323535 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerStarted","Data":"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.325267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-z7wfg" event={"ID":"28451893-15ed-4dc1-a6ef-f93fed27316e","Type":"ContainerStarted","Data":"df939a439aea55a9b58f5e70c74d0ebc91b52ff37d1c0e8237aadbf6cd18fd0c"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.327448 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7edaf8ab-283b-46bc-89e2-a3c8f681624b","Type":"ContainerStarted","Data":"10ba7b674c78b59063bb1ccbacdf6fb036342617fda7764a42f722b6d1d65200"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.330827 4885 generic.go:334] "Generic (PLEG): container finished" podID="32c5b9a2-f65e-4223-ac3f-f49a4e160454" containerID="a3663469c670c736e5269d67152b5ed9953dfe0836aa12e16e52c26348f640fc" exitCode=0 Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.331413 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hgth4" event={"ID":"32c5b9a2-f65e-4223-ac3f-f49a4e160454","Type":"ContainerDied","Data":"a3663469c670c736e5269d67152b5ed9953dfe0836aa12e16e52c26348f640fc"} Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.339433 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjcr8\" (UniqueName: \"kubernetes.io/projected/9e19c4d9-3055-4b50-b37d-f02aab457b39-kube-api-access-sjcr8\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.339478 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.339493 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e19c4d9-3055-4b50-b37d-f02aab457b39-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.353030 4885 scope.go:117] "RemoveContainer" containerID="38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.397267 4885 scope.go:117] "RemoveContainer" containerID="d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7" Dec 05 20:22:55 crc kubenswrapper[4885]: E1205 20:22:55.398544 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7\": container with ID starting with d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7 not found: ID does not exist" containerID="d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.398580 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7"} err="failed to get container status \"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7\": rpc error: code = NotFound desc = could not find container \"d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7\": container with ID starting with d2a054d7d8c8cc63a89fd2bbd0eb4a91d82f4c76a939b169e910a3a8209230f7 not found: ID does not exist" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.398608 4885 scope.go:117] "RemoveContainer" containerID="38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760" Dec 05 20:22:55 crc kubenswrapper[4885]: E1205 20:22:55.399302 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760\": container with ID starting with 38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760 not found: ID does not exist" 
containerID="38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.399330 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760"} err="failed to get container status \"38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760\": rpc error: code = NotFound desc = could not find container \"38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760\": container with ID starting with 38d2e52846fe76574b539bc78708594216223af70902d4bc7768aa703260f760 not found: ID does not exist" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.407329 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-z7wfg" podStartSLOduration=3.134143889 podStartE2EDuration="11.407312245s" podCreationTimestamp="2025-12-05 20:22:44 +0000 UTC" firstStartedPulling="2025-12-05 20:22:45.431388325 +0000 UTC m=+1030.728203976" lastFinishedPulling="2025-12-05 20:22:53.704556671 +0000 UTC m=+1039.001372332" observedRunningTime="2025-12-05 20:22:55.40522585 +0000 UTC m=+1040.702041531" watchObservedRunningTime="2025-12-05 20:22:55.407312245 +0000 UTC m=+1040.704127906" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.427417 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.432387 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.433617 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whjvl"] Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.451946 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.860608594 podStartE2EDuration="13.451928821s" podCreationTimestamp="2025-12-05 20:22:42 +0000 UTC" firstStartedPulling="2025-12-05 20:22:45.620443568 +0000 UTC m=+1030.917259229" lastFinishedPulling="2025-12-05 20:22:53.211763795 +0000 UTC m=+1038.508579456" observedRunningTime="2025-12-05 20:22:55.43817653 +0000 UTC m=+1040.734992191" watchObservedRunningTime="2025-12-05 20:22:55.451928821 +0000 UTC m=+1040.748744482" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.464423 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.731772404 podStartE2EDuration="10.464406091s" podCreationTimestamp="2025-12-05 20:22:45 +0000 UTC" firstStartedPulling="2025-12-05 20:22:51.984546099 +0000 UTC m=+1037.281361760" lastFinishedPulling="2025-12-05 20:22:53.717179786 +0000 UTC m=+1039.013995447" observedRunningTime="2025-12-05 20:22:55.454070917 +0000 UTC m=+1040.750886578" watchObservedRunningTime="2025-12-05 20:22:55.464406091 +0000 UTC m=+1040.761221752" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.684977 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-6ldzw"] Dec 05 20:22:55 crc kubenswrapper[4885]: E1205 20:22:55.685324 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerName="init" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.685341 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" 
containerName="init" Dec 05 20:22:55 crc kubenswrapper[4885]: E1205 20:22:55.685354 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerName="dnsmasq-dns" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.685361 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerName="dnsmasq-dns" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.685500 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" containerName="dnsmasq-dns" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.686259 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.688801 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.748138 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-6ldzw"] Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.848581 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.848641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.848681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5t5\" (UniqueName: \"kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.848874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.872723 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-6ldzw"] Dec 05 20:22:55 crc kubenswrapper[4885]: E1205 20:22:55.873327 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-lc5t5 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" podUID="7bb5651c-f1ea-48db-9916-628e21c0780e" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.891925 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.895302 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.898499 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.908283 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.949983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.950473 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.950518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.950558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5t5\" (UniqueName: \"kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.951118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.951752 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.952368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:55 crc kubenswrapper[4885]: I1205 20:22:55.967787 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5t5\" (UniqueName: \"kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5\") pod \"dnsmasq-dns-846f75bbfc-6ldzw\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.052422 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.052484 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.052593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.052653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhwr\" (UniqueName: \"kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.052712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.154062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhwr\" (UniqueName: \"kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.154137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.154182 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.154250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.154311 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.155415 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.155578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.155641 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.155812 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.171055 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhwr\" (UniqueName: \"kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr\") pod \"dnsmasq-dns-984c76dd7-gh96d\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.216237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.351786 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hgth4" event={"ID":"32c5b9a2-f65e-4223-ac3f-f49a4e160454","Type":"ContainerStarted","Data":"e433fc43a1cba16facead501441b56cf216e6aa334daa2ec3a51b26283ca236a"} Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.354160 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.375096 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.431964 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.459735 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config\") pod \"7bb5651c-f1ea-48db-9916-628e21c0780e\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.459918 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb\") pod \"7bb5651c-f1ea-48db-9916-628e21c0780e\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.459966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc\") pod \"7bb5651c-f1ea-48db-9916-628e21c0780e\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.459993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5t5\" (UniqueName: \"kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5\") pod \"7bb5651c-f1ea-48db-9916-628e21c0780e\" (UID: \"7bb5651c-f1ea-48db-9916-628e21c0780e\") " Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.461000 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config" (OuterVolumeSpecName: "config") pod "7bb5651c-f1ea-48db-9916-628e21c0780e" (UID: "7bb5651c-f1ea-48db-9916-628e21c0780e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.461268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7bb5651c-f1ea-48db-9916-628e21c0780e" (UID: "7bb5651c-f1ea-48db-9916-628e21c0780e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.461780 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bb5651c-f1ea-48db-9916-628e21c0780e" (UID: "7bb5651c-f1ea-48db-9916-628e21c0780e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.466265 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5" (OuterVolumeSpecName: "kube-api-access-lc5t5") pod "7bb5651c-f1ea-48db-9916-628e21c0780e" (UID: "7bb5651c-f1ea-48db-9916-628e21c0780e"). InnerVolumeSpecName "kube-api-access-lc5t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.562081 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.562122 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5t5\" (UniqueName: \"kubernetes.io/projected/7bb5651c-f1ea-48db-9916-628e21c0780e-kube-api-access-lc5t5\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.562135 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.562157 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bb5651c-f1ea-48db-9916-628e21c0780e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.603808 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:56 crc kubenswrapper[4885]: I1205 20:22:56.643519 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:22:57 crc kubenswrapper[4885]: I1205 20:22:57.183781 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e19c4d9-3055-4b50-b37d-f02aab457b39" path="/var/lib/kubelet/pods/9e19c4d9-3055-4b50-b37d-f02aab457b39/volumes" Dec 05 20:22:57 crc kubenswrapper[4885]: I1205 20:22:57.363071 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-6ldzw" Dec 05 20:22:57 crc kubenswrapper[4885]: I1205 20:22:57.363142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" event={"ID":"89ebc5ff-11c5-4724-bcbf-60a4eb26508b","Type":"ContainerStarted","Data":"5cde5dd84694cfe70ca05c1cb175ae4b6ac29718804f74bfde16232279ed46f7"} Dec 05 20:22:57 crc kubenswrapper[4885]: I1205 20:22:57.823941 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.053578 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-6ldzw"] Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.058950 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-6ldzw"] Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.372456 4885 generic.go:334] "Generic (PLEG): container finished" podID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerID="69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d" exitCode=0 Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.373534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" event={"ID":"89ebc5ff-11c5-4724-bcbf-60a4eb26508b","Type":"ContainerDied","Data":"69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d"} Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.376722 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hgth4" event={"ID":"32c5b9a2-f65e-4223-ac3f-f49a4e160454","Type":"ContainerStarted","Data":"9bd66c590a51268836dcba268c265d1c51655a257ba275457dbca92c51c9ffcf"} Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.376767 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.376778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.382716 4885 generic.go:334] "Generic (PLEG): container finished" podID="3e1a8619-8184-43c1-9444-8e86fbc4213d" containerID="d66b598e3afa3baf34bce11d842b381f0da16761713f453e92c12706b9728797" exitCode=0 Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.382997 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3e1a8619-8184-43c1-9444-8e86fbc4213d","Type":"ContainerDied","Data":"d66b598e3afa3baf34bce11d842b381f0da16761713f453e92c12706b9728797"} Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.383055 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.434131 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.445368 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hgth4" podStartSLOduration=8.143289229 podStartE2EDuration="15.445352005s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:22:44.318737516 +0000 UTC m=+1029.615553177" lastFinishedPulling="2025-12-05 20:22:51.620800262 +0000 UTC m=+1036.917615953" observedRunningTime="2025-12-05 20:22:58.440565405 +0000 UTC m=+1043.737381076" 
watchObservedRunningTime="2025-12-05 20:22:58.445352005 +0000 UTC m=+1043.742167666" Dec 05 20:22:58 crc kubenswrapper[4885]: I1205 20:22:58.487202 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.182117 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb5651c-f1ea-48db-9916-628e21c0780e" path="/var/lib/kubelet/pods/7bb5651c-f1ea-48db-9916-628e21c0780e/volumes" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.393890 4885 generic.go:334] "Generic (PLEG): container finished" podID="93184776-73bf-4ff3-9f7f-66b46fd511ed" containerID="d4b55fdb38be1516407bacc8225f4710b58caff0de9e12a6c3a4ba1e38f1a1ce" exitCode=0 Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.394255 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93184776-73bf-4ff3-9f7f-66b46fd511ed","Type":"ContainerDied","Data":"d4b55fdb38be1516407bacc8225f4710b58caff0de9e12a6c3a4ba1e38f1a1ce"} Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.397526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" event={"ID":"89ebc5ff-11c5-4724-bcbf-60a4eb26508b","Type":"ContainerStarted","Data":"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044"} Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.404249 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3e1a8619-8184-43c1-9444-8e86fbc4213d","Type":"ContainerStarted","Data":"28fc3a56b76c9e01a51ca5839e0276cdbdf2687102d532220ec56b7d7c01505b"} Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.453921 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.462729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" podStartSLOduration=4.462703754 podStartE2EDuration="4.462703754s" podCreationTimestamp="2025-12-05 20:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:22:59.45715565 +0000 UTC m=+1044.753971351" watchObservedRunningTime="2025-12-05 20:22:59.462703754 +0000 UTC m=+1044.759519425" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.490362 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.534402698 podStartE2EDuration="26.490334729s" podCreationTimestamp="2025-12-05 20:22:33 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.647451557 +0000 UTC m=+1028.944267218" lastFinishedPulling="2025-12-05 20:22:51.603383588 +0000 UTC m=+1036.900199249" observedRunningTime="2025-12-05 20:22:59.473929845 +0000 UTC m=+1044.770745516" watchObservedRunningTime="2025-12-05 20:22:59.490334729 +0000 UTC m=+1044.787150410" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.638456 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.640649 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.645918 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.658480 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ss8s9" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.658774 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.661065 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.661337 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738483 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-config\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738793 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738815 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738857 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79zx\" (UniqueName: \"kubernetes.io/projected/2cf8581f-1009-4a26-9642-4e154e83dbc1-kube-api-access-h79zx\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-scripts\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.738914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: 
I1205 20:22:59.840459 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-config\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h79zx\" (UniqueName: \"kubernetes.io/projected/2cf8581f-1009-4a26-9642-4e154e83dbc1-kube-api-access-h79zx\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840618 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-scripts\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.840658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.841475 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-config\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.841673 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.842543 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cf8581f-1009-4a26-9642-4e154e83dbc1-scripts\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.845768 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.845842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.847626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf8581f-1009-4a26-9642-4e154e83dbc1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.857312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79zx\" (UniqueName: \"kubernetes.io/projected/2cf8581f-1009-4a26-9642-4e154e83dbc1-kube-api-access-h79zx\") pod \"ovn-northd-0\" (UID: \"2cf8581f-1009-4a26-9642-4e154e83dbc1\") " pod="openstack/ovn-northd-0" Dec 05 20:22:59 crc kubenswrapper[4885]: I1205 20:22:59.974200 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 20:23:00 crc kubenswrapper[4885]: I1205 20:23:00.403810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 20:23:00 crc kubenswrapper[4885]: W1205 20:23:00.418612 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cf8581f_1009_4a26_9642_4e154e83dbc1.slice/crio-babcd0daf1542799628bdef85771b6ee76c76caf47b301192316a3363fa930fa WatchSource:0}: Error finding container babcd0daf1542799628bdef85771b6ee76c76caf47b301192316a3363fa930fa: Status 404 returned error can't find the container with id babcd0daf1542799628bdef85771b6ee76c76caf47b301192316a3363fa930fa Dec 05 20:23:00 crc kubenswrapper[4885]: I1205 20:23:00.429831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93184776-73bf-4ff3-9f7f-66b46fd511ed","Type":"ContainerStarted","Data":"07ef3b2cd604ef2bee268931568c6b0eccbec1d8a1b311122d29e856fab77467"} Dec 05 20:23:00 crc kubenswrapper[4885]: I1205 20:23:00.431338 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:23:00 crc kubenswrapper[4885]: I1205 20:23:00.466567 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.668002735 podStartE2EDuration="26.466548772s" podCreationTimestamp="2025-12-05 20:22:34 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.823520624 +0000 UTC m=+1029.120336285" lastFinishedPulling="2025-12-05 20:22:51.622066661 +0000 UTC m=+1036.918882322" observedRunningTime="2025-12-05 20:23:00.454753342 +0000 UTC m=+1045.751569043" watchObservedRunningTime="2025-12-05 20:23:00.466548772 +0000 UTC m=+1045.763364433" Dec 05 20:23:01 crc kubenswrapper[4885]: I1205 20:23:01.443184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"2cf8581f-1009-4a26-9642-4e154e83dbc1","Type":"ContainerStarted","Data":"babcd0daf1542799628bdef85771b6ee76c76caf47b301192316a3363fa930fa"} Dec 05 20:23:01 crc kubenswrapper[4885]: I1205 20:23:01.819154 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 20:23:02 crc kubenswrapper[4885]: I1205 20:23:02.453089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2cf8581f-1009-4a26-9642-4e154e83dbc1","Type":"ContainerStarted","Data":"c797bd44de4406156bbdad47f05fad6e9d640c7c5ed24ef9ff6b6c56ef0f7f44"} Dec 05 20:23:02 crc kubenswrapper[4885]: I1205 20:23:02.453138 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2cf8581f-1009-4a26-9642-4e154e83dbc1","Type":"ContainerStarted","Data":"8aa03108c83a386a243b62be7a92a7a18cc521caa1652a457c9f28e82c81aca0"} Dec 05 20:23:02 crc kubenswrapper[4885]: I1205 20:23:02.453263 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 20:23:02 crc kubenswrapper[4885]: I1205 20:23:02.477075 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.4929746059999998 podStartE2EDuration="3.477056514s" podCreationTimestamp="2025-12-05 20:22:59 +0000 UTC" firstStartedPulling="2025-12-05 20:23:00.421694079 +0000 UTC m=+1045.718509750" lastFinishedPulling="2025-12-05 20:23:01.405775997 +0000 UTC m=+1046.702591658" observedRunningTime="2025-12-05 20:23:02.475271607 +0000 UTC m=+1047.772087278" watchObservedRunningTime="2025-12-05 20:23:02.477056514 +0000 UTC m=+1047.773872175" Dec 05 20:23:04 crc kubenswrapper[4885]: I1205 20:23:04.864733 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 20:23:04 crc kubenswrapper[4885]: I1205 20:23:04.865119 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.219040 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.253209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.253358 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.275817 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.276084 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="dnsmasq-dns" containerID="cri-o://3cf8e1c6e6b8c7fd66373a16e2bfbbe3451a3a140dd01dc3ecaf3f487d76527e" gracePeriod=10 Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.483876 4885 generic.go:334] "Generic (PLEG): container finished" podID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerID="3cf8e1c6e6b8c7fd66373a16e2bfbbe3451a3a140dd01dc3ecaf3f487d76527e" exitCode=0 Dec 05 20:23:06 crc kubenswrapper[4885]: I1205 20:23:06.483943 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" 
event={"ID":"2522c28b-c324-408d-a8b7-7e3d83709a6a","Type":"ContainerDied","Data":"3cf8e1c6e6b8c7fd66373a16e2bfbbe3451a3a140dd01dc3ecaf3f487d76527e"} Dec 05 20:23:07 crc kubenswrapper[4885]: I1205 20:23:07.253778 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Dec 05 20:23:07 crc kubenswrapper[4885]: I1205 20:23:07.522032 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 20:23:07 crc kubenswrapper[4885]: I1205 20:23:07.603610 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="3e1a8619-8184-43c1-9444-8e86fbc4213d" containerName="galera" probeResult="failure" output=< Dec 05 20:23:07 crc kubenswrapper[4885]: wsrep_local_state_comment (Joined) differs from Synced Dec 05 20:23:07 crc kubenswrapper[4885]: > Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.621098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.724840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.726729 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.758677 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.793216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.793323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.793364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.793420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dd6p\" (UniqueName: \"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.793466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.895081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.895833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.895918 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dd6p\" (UniqueName: \"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.896205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.896749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.896784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.896846 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.896927 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.897382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: 
\"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:08 crc kubenswrapper[4885]: I1205 20:23:08.914953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dd6p\" (UniqueName: \"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p\") pod \"dnsmasq-dns-784d65c867-xkgrp\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.055202 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.473123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:09 crc kubenswrapper[4885]: W1205 20:23:09.477552 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f8ad64_3f8e_462a_91ae_091750185877.slice/crio-4d93e6acfd361e65bc0cfe4c2d8d2400439dd166902e700a67d0c542afaf5b5e WatchSource:0}: Error finding container 4d93e6acfd361e65bc0cfe4c2d8d2400439dd166902e700a67d0c542afaf5b5e: Status 404 returned error can't find the container with id 4d93e6acfd361e65bc0cfe4c2d8d2400439dd166902e700a67d0c542afaf5b5e Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.508308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" event={"ID":"92f8ad64-3f8e-462a-91ae-091750185877","Type":"ContainerStarted","Data":"4d93e6acfd361e65bc0cfe4c2d8d2400439dd166902e700a67d0c542afaf5b5e"} Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.819992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.826597 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.831351 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.831638 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d7rp4" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.831797 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.832264 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.845610 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.911937 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.912108 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-lock\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.912180 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.912205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-cache\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: I1205 20:23:09.912274 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jht6z\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-kube-api-access-jht6z\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:09 crc kubenswrapper[4885]: E1205 20:23:09.912439 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f8ad64_3f8e_462a_91ae_091750185877.slice/crio-b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jht6z\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-kube-api-access-jht6z\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc 
kubenswrapper[4885]: I1205 20:23:10.014285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014329 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-lock\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014362 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014381 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-cache\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.014502 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.014534 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.014583 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:10.514567432 +0000 UTC m=+1055.811383093 (durationBeforeRetry 500ms). 
Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.014583 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:10.514567432 +0000 UTC m=+1055.811383093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift") pod "swift-storage-0" (UID: "18b127df-3095-45b6-b347-f1906d6317fe") : configmap "swift-ring-files" not found
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-lock\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.014978 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.016104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b127df-3095-45b6-b347-f1906d6317fe-cache\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.034953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jht6z\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-kube-api-access-jht6z\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.036860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.093397 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vg7pf"]
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.094400 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vg7pf"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.095759 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.096105 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.096257 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
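The etc-swift mount fails here because the swift-ring-files ConfigMap does not exist yet (it is produced by the swift-ring-rebalance job that is only now being scheduled), and kubelet requeues the operation with exponential backoff: this log shows durationBeforeRetry doubling through 500ms, 1s, 2s, and 4s on successive attempts. A minimal sketch of that retry policy; the upper bound below is an assumption, since this log never runs long enough to show a cap:

```go
package sketch

import "time"

// nextBackoff reproduces the doubling visible in this log: 500ms, 1s, 2s, 4s.
func nextBackoff(last time.Duration) time.Duration {
	const (
		initial  = 500 * time.Millisecond
		maxDelay = 2 * time.Minute // assumed cap, not taken from this log
	)
	if last == 0 {
		return initial
	}
	if next := 2 * last; next < maxDelay {
		return next
	}
	return maxDelay
}
```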
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.120154 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.140441 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vg7pf"] Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.141208 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-8xnff ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-8xnff ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-vg7pf" podUID="befc3d65-441e-4be1-9e02-7a5183e1a2d0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.151306 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2j6cb"] Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.151733 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="init" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.151748 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="init" Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.151760 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="dnsmasq-dns" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.151767 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="dnsmasq-dns" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.151950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" containerName="dnsmasq-dns" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.152591 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.201080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2j6cb"] Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.216871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc\") pod \"2522c28b-c324-408d-a8b7-7e3d83709a6a\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.216916 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tzxj\" (UniqueName: \"kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj\") pod \"2522c28b-c324-408d-a8b7-7e3d83709a6a\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.216953 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config\") pod \"2522c28b-c324-408d-a8b7-7e3d83709a6a\" (UID: \"2522c28b-c324-408d-a8b7-7e3d83709a6a\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217183 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217249 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217417 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217491 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtz8c\" (UniqueName: \"kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xnff\" (UniqueName: \"kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.217567 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.219608 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/swift-ring-rebalance-vg7pf"] Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.238108 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj" (OuterVolumeSpecName: "kube-api-access-8tzxj") pod "2522c28b-c324-408d-a8b7-7e3d83709a6a" (UID: "2522c28b-c324-408d-a8b7-7e3d83709a6a"). InnerVolumeSpecName "kube-api-access-8tzxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.253515 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.260070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config" (OuterVolumeSpecName: "config") pod "2522c28b-c324-408d-a8b7-7e3d83709a6a" (UID: "2522c28b-c324-408d-a8b7-7e3d83709a6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.263662 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2522c28b-c324-408d-a8b7-7e3d83709a6a" (UID: "2522c28b-c324-408d-a8b7-7e3d83709a6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319758 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtz8c\" (UniqueName: \"kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319805 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xnff\" (UniqueName: \"kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319907 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319935 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.319997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320163 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320174 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tzxj\" (UniqueName: \"kubernetes.io/projected/2522c28b-c324-408d-a8b7-7e3d83709a6a-kube-api-access-8tzxj\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.320183 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2522c28b-c324-408d-a8b7-7e3d83709a6a-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321070 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321233 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321464 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321475 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.321969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.322902 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.323342 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle\") pod \"swift-ring-rebalance-2j6cb\" (UID: 
\"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.324353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.324740 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.330958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.336654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.339150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtz8c\" (UniqueName: \"kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c\") pod \"swift-ring-rebalance-2j6cb\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.339511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xnff\" (UniqueName: \"kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff\") pod \"swift-ring-rebalance-vg7pf\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.478227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.519673 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" event={"ID":"2522c28b-c324-408d-a8b7-7e3d83709a6a","Type":"ContainerDied","Data":"96f85ac3ec9d703566c091de6c7c6f6d574049ceb4f57a346b377e201a43cc22"} Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.520198 4885 scope.go:117] "RemoveContainer" containerID="3cf8e1c6e6b8c7fd66373a16e2bfbbe3451a3a140dd01dc3ecaf3f487d76527e" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.520205 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-sg9cr" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.522590 4885 generic.go:334] "Generic (PLEG): container finished" podID="92f8ad64-3f8e-462a-91ae-091750185877" containerID="b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74" exitCode=0 Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.522642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" event={"ID":"92f8ad64-3f8e-462a-91ae-091750185877","Type":"ContainerDied","Data":"b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74"} Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.522763 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.523782 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.524153 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.524177 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:23:10 crc kubenswrapper[4885]: E1205 20:23:10.524215 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:11.524202772 +0000 UTC m=+1056.821018433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift") pod "swift-storage-0" (UID: "18b127df-3095-45b6-b347-f1906d6317fe") : configmap "swift-ring-files" not found Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.544956 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.579606 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.582849 4885 scope.go:117] "RemoveContainer" containerID="c0e0b6a7d108f7e0afb543795b1f8f317ad869723d8c3fbb2e4dd8d6a98eab0f" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.598891 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-sg9cr"] Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625156 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625213 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625269 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625306 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625333 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xnff\" (UniqueName: \"kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.625468 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf\") pod \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\" (UID: \"befc3d65-441e-4be1-9e02-7a5183e1a2d0\") " Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.627620 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.627658 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts" (OuterVolumeSpecName: "scripts") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.627901 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.630316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff" (OuterVolumeSpecName: "kube-api-access-8xnff") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "kube-api-access-8xnff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.630374 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.632558 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.641343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "befc3d65-441e-4be1-9e02-7a5183e1a2d0" (UID: "befc3d65-441e-4be1-9e02-7a5183e1a2d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733766 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733802 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733813 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/befc3d65-441e-4be1-9e02-7a5183e1a2d0-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733821 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733830 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/befc3d65-441e-4be1-9e02-7a5183e1a2d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733838 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/befc3d65-441e-4be1-9e02-7a5183e1a2d0-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.733849 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xnff\" (UniqueName: \"kubernetes.io/projected/befc3d65-441e-4be1-9e02-7a5183e1a2d0-kube-api-access-8xnff\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:10 crc kubenswrapper[4885]: I1205 20:23:10.912050 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2j6cb"] Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.185739 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2522c28b-c324-408d-a8b7-7e3d83709a6a" path="/var/lib/kubelet/pods/2522c28b-c324-408d-a8b7-7e3d83709a6a/volumes" Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.535687 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2j6cb" event={"ID":"c5c452f6-0d03-4e67-bab0-0dcb1926f523","Type":"ContainerStarted","Data":"9b965b6a64d75a3d5473b779741f86be30e440433515addd9c385dc9a5cbcb07"} Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.539341 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vg7pf" Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.539338 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" event={"ID":"92f8ad64-3f8e-462a-91ae-091750185877","Type":"ContainerStarted","Data":"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5"} Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.540146 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.546302 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:11 crc kubenswrapper[4885]: E1205 20:23:11.546480 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:23:11 crc kubenswrapper[4885]: E1205 20:23:11.546748 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:23:11 crc kubenswrapper[4885]: E1205 20:23:11.546822 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:13.546799856 +0000 UTC m=+1058.843615527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift") pod "swift-storage-0" (UID: "18b127df-3095-45b6-b347-f1906d6317fe") : configmap "swift-ring-files" not found Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.571867 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" podStartSLOduration=3.571839689 podStartE2EDuration="3.571839689s" podCreationTimestamp="2025-12-05 20:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:11.566868544 +0000 UTC m=+1056.863684225" watchObservedRunningTime="2025-12-05 20:23:11.571839689 +0000 UTC m=+1056.868655350" Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.602059 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vg7pf"] Dec 05 20:23:11 crc kubenswrapper[4885]: I1205 20:23:11.615256 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vg7pf"] Dec 05 20:23:13 crc kubenswrapper[4885]: I1205 20:23:13.188831 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befc3d65-441e-4be1-9e02-7a5183e1a2d0" path="/var/lib/kubelet/pods/befc3d65-441e-4be1-9e02-7a5183e1a2d0/volumes" Dec 05 20:23:13 crc kubenswrapper[4885]: I1205 20:23:13.579270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:13 crc kubenswrapper[4885]: E1205 20:23:13.579508 4885 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:23:13 crc kubenswrapper[4885]: E1205 20:23:13.579527 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:23:13 crc kubenswrapper[4885]: E1205 20:23:13.579581 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:17.579561094 +0000 UTC m=+1062.876376755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift") pod "swift-storage-0" (UID: "18b127df-3095-45b6-b347-f1906d6317fe") : configmap "swift-ring-files" not found Dec 05 20:23:14 crc kubenswrapper[4885]: I1205 20:23:14.576905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2j6cb" event={"ID":"c5c452f6-0d03-4e67-bab0-0dcb1926f523","Type":"ContainerStarted","Data":"0a893f0189ea2ee6748126c1ab652ed26e8db8469b1f9b6e8b4c2eead6882e3f"} Dec 05 20:23:14 crc kubenswrapper[4885]: I1205 20:23:14.599897 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2j6cb" podStartSLOduration=1.289755017 podStartE2EDuration="4.599879806s" podCreationTimestamp="2025-12-05 20:23:10 +0000 UTC" firstStartedPulling="2025-12-05 20:23:10.926203886 +0000 UTC m=+1056.223019547" lastFinishedPulling="2025-12-05 20:23:14.236328675 +0000 UTC m=+1059.533144336" observedRunningTime="2025-12-05 20:23:14.594062235 +0000 UTC m=+1059.890877896" watchObservedRunningTime="2025-12-05 20:23:14.599879806 +0000 UTC m=+1059.896695467" Dec 05 20:23:14 crc kubenswrapper[4885]: I1205 20:23:14.956513 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 20:23:15 crc kubenswrapper[4885]: I1205 20:23:15.075297 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.406137 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4e7e-account-create-update-8xkqd"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.407701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.409249 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.420120 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e7e-account-create-update-8xkqd"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.467097 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-nlh7l"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.468060 4885 util.go:30] "No sandbox for pod can be found. 
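The two "Observed pod startup duration" entries above are internally consistent: podStartSLOduration is the end-to-end duration minus the image-pull window, so it equals podStartE2EDuration when no pull happened (the dnsmasq pod, with zero-valued pull timestamps) and is shorter when one did (the swift-ring-rebalance-2j6cb pod). A small Go program cross-checking the 2j6cb arithmetic from the log values:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the swift-ring-rebalance-2j6cb entry above.
	e2e := 4599879806 * time.Nanosecond // podStartE2EDuration="4.599879806s"
	firstPull := time.Date(2025, 12, 5, 20, 23, 10, 926203886, time.UTC)
	lastPull := time.Date(2025, 12, 5, 20, 23, 14, 236328675, time.UTC)

	// SLO duration = E2E duration minus the time spent pulling images.
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 1.289755017s, matching podStartSLOduration=1.289755017
}
```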
Need to start a new one" pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.477619 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-nlh7l"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.525331 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.525380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nt2\" (UniqueName: \"kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.627279 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.627344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nt2\" (UniqueName: \"kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.627482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkmj\" (UniqueName: \"kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.627561 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.628073 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.656118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nt2\" (UniqueName: \"kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2\") pod \"keystone-4e7e-account-create-update-8xkqd\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " 
pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.724845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.733216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkmj\" (UniqueName: \"kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.733314 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.734273 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.754670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkmj\" (UniqueName: \"kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj\") pod \"keystone-db-create-nlh7l\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.781962 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.865615 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kphfn"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.867520 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kphfn" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.894256 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kphfn"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.940736 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4a97-account-create-update-58vvg"] Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.942313 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.945421 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 20:23:16 crc kubenswrapper[4885]: I1205 20:23:16.953333 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4a97-account-create-update-58vvg"] Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.043973 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.044321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.044387 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mt8f\" (UniqueName: \"kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.044452 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfqq\" (UniqueName: \"kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.145474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.145521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.145567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mt8f\" (UniqueName: \"kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.145597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbfqq\" (UniqueName: 
\"kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.146327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.146340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.169553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfqq\" (UniqueName: \"kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq\") pod \"placement-4a97-account-create-update-58vvg\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.170450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mt8f\" (UniqueName: \"kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f\") pod \"placement-db-create-kphfn\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.212967 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e7e-account-create-update-8xkqd"] Dec 05 20:23:17 crc kubenswrapper[4885]: W1205 20:23:17.216459 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb59301_abf6_47e6_9f76_86e7908c07f2.slice/crio-47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9 WatchSource:0}: Error finding container 47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9: Status 404 returned error can't find the container with id 47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9 Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.231627 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kphfn" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.263192 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.324703 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-nlh7l"] Dec 05 20:23:17 crc kubenswrapper[4885]: W1205 20:23:17.452873 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cdb57a5_2227_4495_b30d_e0867eba0435.slice/crio-bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461 WatchSource:0}: Error finding container bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461: Status 404 returned error can't find the container with id bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461 Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.605862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e7e-account-create-update-8xkqd" event={"ID":"7cb59301-abf6-47e6-9f76-86e7908c07f2","Type":"ContainerStarted","Data":"08a9618affed1c0c7b45043b426f556e6034ceef394f856fa6e0c76bc9428633"} Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.605931 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e7e-account-create-update-8xkqd" event={"ID":"7cb59301-abf6-47e6-9f76-86e7908c07f2","Type":"ContainerStarted","Data":"47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9"} Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.608198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nlh7l" event={"ID":"4cdb57a5-2227-4495-b30d-e0867eba0435","Type":"ContainerStarted","Data":"90a9df812a347b1c68cb393348dbca8f688fadb9f85e156a8f21a37fa650fc72"} Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.608231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nlh7l" event={"ID":"4cdb57a5-2227-4495-b30d-e0867eba0435","Type":"ContainerStarted","Data":"bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461"} Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.625013 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-4e7e-account-create-update-8xkqd" podStartSLOduration=1.6249940619999999 podStartE2EDuration="1.624994062s" podCreationTimestamp="2025-12-05 20:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:17.620103979 +0000 UTC m=+1062.916919640" watchObservedRunningTime="2025-12-05 20:23:17.624994062 +0000 UTC m=+1062.921809723" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.652130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:17 crc kubenswrapper[4885]: E1205 20:23:17.652316 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 20:23:17 crc kubenswrapper[4885]: E1205 20:23:17.652333 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 20:23:17 crc kubenswrapper[4885]: E1205 20:23:17.652376 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift podName:18b127df-3095-45b6-b347-f1906d6317fe nodeName:}" failed. No retries permitted until 2025-12-05 20:23:25.652359588 +0000 UTC m=+1070.949175249 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift") pod "swift-storage-0" (UID: "18b127df-3095-45b6-b347-f1906d6317fe") : configmap "swift-ring-files" not found Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.670967 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-nlh7l" podStartSLOduration=1.670923148 podStartE2EDuration="1.670923148s" podCreationTimestamp="2025-12-05 20:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:17.636173202 +0000 UTC m=+1062.932988863" watchObservedRunningTime="2025-12-05 20:23:17.670923148 +0000 UTC m=+1062.967738809" Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.676693 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kphfn"] Dec 05 20:23:17 crc kubenswrapper[4885]: W1205 20:23:17.677433 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod080544c2_141c_49d0_86a9_533fefe28a4f.slice/crio-357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4 WatchSource:0}: Error finding container 357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4: Status 404 returned error can't find the container with id 357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4 Dec 05 20:23:17 crc kubenswrapper[4885]: I1205 20:23:17.748278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4a97-account-create-update-58vvg"] Dec 05 20:23:17 crc kubenswrapper[4885]: W1205 20:23:17.769048 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4741a673_bd48_498b_bade_5b2dfb1b0cce.slice/crio-c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb WatchSource:0}: Error finding container c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb: Status 404 returned error can't find the container with id c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.621801 4885 generic.go:334] "Generic (PLEG): container finished" podID="4741a673-bd48-498b-bade-5b2dfb1b0cce" containerID="285344bc6bd10b5d64b925d943af4a7898fbcecec219dc4f6bbc71d172313cb8" exitCode=0 Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.622345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4a97-account-create-update-58vvg" event={"ID":"4741a673-bd48-498b-bade-5b2dfb1b0cce","Type":"ContainerDied","Data":"285344bc6bd10b5d64b925d943af4a7898fbcecec219dc4f6bbc71d172313cb8"} Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.622389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4a97-account-create-update-58vvg" event={"ID":"4741a673-bd48-498b-bade-5b2dfb1b0cce","Type":"ContainerStarted","Data":"c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb"} Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.628710 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="080544c2-141c-49d0-86a9-533fefe28a4f" containerID="34d33acc37f610c18801fded2160e52ac8145fb1c29cdcc67c068ddacc35df31" exitCode=0 Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.628813 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kphfn" event={"ID":"080544c2-141c-49d0-86a9-533fefe28a4f","Type":"ContainerDied","Data":"34d33acc37f610c18801fded2160e52ac8145fb1c29cdcc67c068ddacc35df31"} Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.628892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kphfn" event={"ID":"080544c2-141c-49d0-86a9-533fefe28a4f","Type":"ContainerStarted","Data":"357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4"} Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.631416 4885 generic.go:334] "Generic (PLEG): container finished" podID="4cdb57a5-2227-4495-b30d-e0867eba0435" containerID="90a9df812a347b1c68cb393348dbca8f688fadb9f85e156a8f21a37fa650fc72" exitCode=0 Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.631480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nlh7l" event={"ID":"4cdb57a5-2227-4495-b30d-e0867eba0435","Type":"ContainerDied","Data":"90a9df812a347b1c68cb393348dbca8f688fadb9f85e156a8f21a37fa650fc72"} Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.634368 4885 generic.go:334] "Generic (PLEG): container finished" podID="7cb59301-abf6-47e6-9f76-86e7908c07f2" containerID="08a9618affed1c0c7b45043b426f556e6034ceef394f856fa6e0c76bc9428633" exitCode=0 Dec 05 20:23:18 crc kubenswrapper[4885]: I1205 20:23:18.634397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e7e-account-create-update-8xkqd" event={"ID":"7cb59301-abf6-47e6-9f76-86e7908c07f2","Type":"ContainerDied","Data":"08a9618affed1c0c7b45043b426f556e6034ceef394f856fa6e0c76bc9428633"} Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.057227 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.137214 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.137484 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="dnsmasq-dns" containerID="cri-o://9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044" gracePeriod=10 Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.621461 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.645045 4885 generic.go:334] "Generic (PLEG): container finished" podID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerID="9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044" exitCode=0 Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.645220 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.645566 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" event={"ID":"89ebc5ff-11c5-4724-bcbf-60a4eb26508b","Type":"ContainerDied","Data":"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044"} Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.645591 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-gh96d" event={"ID":"89ebc5ff-11c5-4724-bcbf-60a4eb26508b","Type":"ContainerDied","Data":"5cde5dd84694cfe70ca05c1cb175ae4b6ac29718804f74bfde16232279ed46f7"} Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.645605 4885 scope.go:117] "RemoveContainer" containerID="9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.682551 4885 scope.go:117] "RemoveContainer" containerID="69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.717826 4885 scope.go:117] "RemoveContainer" containerID="9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044" Dec 05 20:23:19 crc kubenswrapper[4885]: E1205 20:23:19.718363 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044\": container with ID starting with 9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044 not found: ID does not exist" containerID="9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.718403 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044"} err="failed to get container status \"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044\": rpc error: code = NotFound desc = could not find container \"9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044\": container with ID starting with 9c3fe83b737d64eb72724dfe898171da23229e29f5b64d26e01e947ff8a0a044 not found: ID does not exist" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.718431 4885 scope.go:117] "RemoveContainer" containerID="69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d" Dec 05 20:23:19 crc kubenswrapper[4885]: E1205 20:23:19.718741 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d\": container with ID starting with 69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d not found: ID does not exist" containerID="69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.718771 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d"} err="failed to get container status \"69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d\": rpc error: code = NotFound desc = could not find container \"69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d\": container with ID starting with 69debaac6aa9698c99bb2034740ecb25c839548c2eceffba8c22009a2c25981d not found: ID does not exist" Dec 05 20:23:19 crc 
kubenswrapper[4885]: I1205 20:23:19.791291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxhwr\" (UniqueName: \"kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr\") pod \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.791351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb\") pod \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.791387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config\") pod \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.791417 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc\") pod \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.791497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb\") pod \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\" (UID: \"89ebc5ff-11c5-4724-bcbf-60a4eb26508b\") " Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.799803 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr" (OuterVolumeSpecName: "kube-api-access-rxhwr") pod "89ebc5ff-11c5-4724-bcbf-60a4eb26508b" (UID: "89ebc5ff-11c5-4724-bcbf-60a4eb26508b"). InnerVolumeSpecName "kube-api-access-rxhwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.843074 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config" (OuterVolumeSpecName: "config") pod "89ebc5ff-11c5-4724-bcbf-60a4eb26508b" (UID: "89ebc5ff-11c5-4724-bcbf-60a4eb26508b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.844180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89ebc5ff-11c5-4724-bcbf-60a4eb26508b" (UID: "89ebc5ff-11c5-4724-bcbf-60a4eb26508b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.845801 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89ebc5ff-11c5-4724-bcbf-60a4eb26508b" (UID: "89ebc5ff-11c5-4724-bcbf-60a4eb26508b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.851104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89ebc5ff-11c5-4724-bcbf-60a4eb26508b" (UID: "89ebc5ff-11c5-4724-bcbf-60a4eb26508b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.894315 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxhwr\" (UniqueName: \"kubernetes.io/projected/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-kube-api-access-rxhwr\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.894634 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.894643 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.894651 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:19 crc kubenswrapper[4885]: I1205 20:23:19.894659 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89ebc5ff-11c5-4724-bcbf-60a4eb26508b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.015460 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.021116 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-gh96d"] Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.021152 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.140375 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.158349 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.174207 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kphfn" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.197546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts\") pod \"4cdb57a5-2227-4495-b30d-e0867eba0435\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.197772 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qkmj\" (UniqueName: \"kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj\") pod \"4cdb57a5-2227-4495-b30d-e0867eba0435\" (UID: \"4cdb57a5-2227-4495-b30d-e0867eba0435\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.198411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cdb57a5-2227-4495-b30d-e0867eba0435" (UID: "4cdb57a5-2227-4495-b30d-e0867eba0435"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.203342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj" (OuterVolumeSpecName: "kube-api-access-4qkmj") pod "4cdb57a5-2227-4495-b30d-e0867eba0435" (UID: "4cdb57a5-2227-4495-b30d-e0867eba0435"). InnerVolumeSpecName "kube-api-access-4qkmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.303202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4nt2\" (UniqueName: \"kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2\") pod \"7cb59301-abf6-47e6-9f76-86e7908c07f2\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.303299 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts\") pod \"7cb59301-abf6-47e6-9f76-86e7908c07f2\" (UID: \"7cb59301-abf6-47e6-9f76-86e7908c07f2\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.303510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbfqq\" (UniqueName: \"kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq\") pod \"4741a673-bd48-498b-bade-5b2dfb1b0cce\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.304265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts\") pod \"080544c2-141c-49d0-86a9-533fefe28a4f\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.304341 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts\") pod \"4741a673-bd48-498b-bade-5b2dfb1b0cce\" (UID: \"4741a673-bd48-498b-bade-5b2dfb1b0cce\") " Dec 05 20:23:20 crc kubenswrapper[4885]: 
I1205 20:23:20.304415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mt8f\" (UniqueName: \"kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f\") pod \"080544c2-141c-49d0-86a9-533fefe28a4f\" (UID: \"080544c2-141c-49d0-86a9-533fefe28a4f\") " Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.305200 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qkmj\" (UniqueName: \"kubernetes.io/projected/4cdb57a5-2227-4495-b30d-e0867eba0435-kube-api-access-4qkmj\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.305226 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdb57a5-2227-4495-b30d-e0867eba0435-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.305308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4741a673-bd48-498b-bade-5b2dfb1b0cce" (UID: "4741a673-bd48-498b-bade-5b2dfb1b0cce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.305341 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "080544c2-141c-49d0-86a9-533fefe28a4f" (UID: "080544c2-141c-49d0-86a9-533fefe28a4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.305363 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cb59301-abf6-47e6-9f76-86e7908c07f2" (UID: "7cb59301-abf6-47e6-9f76-86e7908c07f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.309774 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2" (OuterVolumeSpecName: "kube-api-access-d4nt2") pod "7cb59301-abf6-47e6-9f76-86e7908c07f2" (UID: "7cb59301-abf6-47e6-9f76-86e7908c07f2"). InnerVolumeSpecName "kube-api-access-d4nt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.310126 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq" (OuterVolumeSpecName: "kube-api-access-rbfqq") pod "4741a673-bd48-498b-bade-5b2dfb1b0cce" (UID: "4741a673-bd48-498b-bade-5b2dfb1b0cce"). InnerVolumeSpecName "kube-api-access-rbfqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.310319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f" (OuterVolumeSpecName: "kube-api-access-5mt8f") pod "080544c2-141c-49d0-86a9-533fefe28a4f" (UID: "080544c2-141c-49d0-86a9-533fefe28a4f"). 
InnerVolumeSpecName "kube-api-access-5mt8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.406604 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/080544c2-141c-49d0-86a9-533fefe28a4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.406956 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4741a673-bd48-498b-bade-5b2dfb1b0cce-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.406968 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mt8f\" (UniqueName: \"kubernetes.io/projected/080544c2-141c-49d0-86a9-533fefe28a4f-kube-api-access-5mt8f\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.406979 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4nt2\" (UniqueName: \"kubernetes.io/projected/7cb59301-abf6-47e6-9f76-86e7908c07f2-kube-api-access-d4nt2\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.407009 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb59301-abf6-47e6-9f76-86e7908c07f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.407043 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbfqq\" (UniqueName: \"kubernetes.io/projected/4741a673-bd48-498b-bade-5b2dfb1b0cce-kube-api-access-rbfqq\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.671239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e7e-account-create-update-8xkqd" event={"ID":"7cb59301-abf6-47e6-9f76-86e7908c07f2","Type":"ContainerDied","Data":"47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9"} Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.672468 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47d63c9249ccd11e1797f2ad8f9a9ae5be6ab9e589c26e246ec9c2145145a9f9" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.671321 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e7e-account-create-update-8xkqd" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.677406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4a97-account-create-update-58vvg" event={"ID":"4741a673-bd48-498b-bade-5b2dfb1b0cce","Type":"ContainerDied","Data":"c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb"} Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.677471 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0d745509f6423a31533043122739f8723b01024e8f7262609807e53736ff3bb" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.677567 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4a97-account-create-update-58vvg" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.682347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kphfn" event={"ID":"080544c2-141c-49d0-86a9-533fefe28a4f","Type":"ContainerDied","Data":"357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4"} Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.682406 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357d509981bdd221cf16775f43ef88e1038c4883893b4e97e9ce22a4da64dee4" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.682483 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kphfn" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.686046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nlh7l" event={"ID":"4cdb57a5-2227-4495-b30d-e0867eba0435","Type":"ContainerDied","Data":"bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461"} Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.686084 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbcc8ee9cb683beaae2321f82cedfd79282629ac860488c94117aaec5cbc0461" Dec 05 20:23:20 crc kubenswrapper[4885]: I1205 20:23:20.686151 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nlh7l" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.189465 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" path="/var/lib/kubelet/pods/89ebc5ff-11c5-4724-bcbf-60a4eb26508b/volumes" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.700594 4885 generic.go:334] "Generic (PLEG): container finished" podID="c5c452f6-0d03-4e67-bab0-0dcb1926f523" containerID="0a893f0189ea2ee6748126c1ab652ed26e8db8469b1f9b6e8b4c2eead6882e3f" exitCode=0 Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.700729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2j6cb" event={"ID":"c5c452f6-0d03-4e67-bab0-0dcb1926f523","Type":"ContainerDied","Data":"0a893f0189ea2ee6748126c1ab652ed26e8db8469b1f9b6e8b4c2eead6882e3f"} Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.940253 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-z548t"] Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.940906 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="dnsmasq-dns" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.940926 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="dnsmasq-dns" Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.940959 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4741a673-bd48-498b-bade-5b2dfb1b0cce" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.940967 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4741a673-bd48-498b-bade-5b2dfb1b0cce" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.940990 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb59301-abf6-47e6-9f76-86e7908c07f2" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: 
I1205 20:23:21.941004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb59301-abf6-47e6-9f76-86e7908c07f2" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.941041 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="init" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941056 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="init" Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.941088 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080544c2-141c-49d0-86a9-533fefe28a4f" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941098 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="080544c2-141c-49d0-86a9-533fefe28a4f" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: E1205 20:23:21.941127 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdb57a5-2227-4495-b30d-e0867eba0435" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941135 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdb57a5-2227-4495-b30d-e0867eba0435" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941507 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="080544c2-141c-49d0-86a9-533fefe28a4f" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941539 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ebc5ff-11c5-4724-bcbf-60a4eb26508b" containerName="dnsmasq-dns" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941557 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4741a673-bd48-498b-bade-5b2dfb1b0cce" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941581 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdb57a5-2227-4495-b30d-e0867eba0435" containerName="mariadb-database-create" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.941598 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb59301-abf6-47e6-9f76-86e7908c07f2" containerName="mariadb-account-create-update" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.943649 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z548t" Dec 05 20:23:21 crc kubenswrapper[4885]: I1205 20:23:21.970807 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z548t"] Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.034972 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9380-account-create-update-dnrjv"] Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.035970 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.038532 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.040776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts\") pod \"glance-db-create-z548t\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.040836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttlvw\" (UniqueName: \"kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw\") pod \"glance-db-create-z548t\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.046760 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9380-account-create-update-dnrjv"] Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.142371 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts\") pod \"glance-db-create-z548t\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.142407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttlvw\" (UniqueName: \"kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw\") pod \"glance-db-create-z548t\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.142489 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.142548 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlr5p\" (UniqueName: \"kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.143290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts\") pod \"glance-db-create-z548t\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.157890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttlvw\" (UniqueName: \"kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw\") pod \"glance-db-create-z548t\" (UID: 
\"82526f77-6157-4c1e-b14d-72e377c0971b\") " pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.243674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.244046 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlr5p\" (UniqueName: \"kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.245115 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.259572 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlr5p\" (UniqueName: \"kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p\") pod \"glance-9380-account-create-update-dnrjv\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.272270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z548t" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.353008 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.745170 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z548t"] Dec 05 20:23:22 crc kubenswrapper[4885]: W1205 20:23:22.758770 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82526f77_6157_4c1e_b14d_72e377c0971b.slice/crio-2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781 WatchSource:0}: Error finding container 2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781: Status 404 returned error can't find the container with id 2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781 Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.838212 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9380-account-create-update-dnrjv"] Dec 05 20:23:22 crc kubenswrapper[4885]: W1205 20:23:22.844801 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78491fc8_8cb0_489e_98b5_a3f19812d082.slice/crio-63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932 WatchSource:0}: Error finding container 63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932: Status 404 returned error can't find the container with id 63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932 Dec 05 20:23:22 crc kubenswrapper[4885]: I1205 20:23:22.990250 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161797 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161820 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtz8c\" (UniqueName: \"kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.161945 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.162064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf\") pod \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\" (UID: \"c5c452f6-0d03-4e67-bab0-0dcb1926f523\") " Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.162610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.163577 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.167952 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c" (OuterVolumeSpecName: "kube-api-access-xtz8c") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "kube-api-access-xtz8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.170584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.189955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts" (OuterVolumeSpecName: "scripts") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.192539 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.205288 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5c452f6-0d03-4e67-bab0-0dcb1926f523" (UID: "c5c452f6-0d03-4e67-bab0-0dcb1926f523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264057 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264225 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264298 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264362 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5c452f6-0d03-4e67-bab0-0dcb1926f523-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264432 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c452f6-0d03-4e67-bab0-0dcb1926f523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264504 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5c452f6-0d03-4e67-bab0-0dcb1926f523-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.264580 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtz8c\" (UniqueName: \"kubernetes.io/projected/c5c452f6-0d03-4e67-bab0-0dcb1926f523-kube-api-access-xtz8c\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.720858 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2j6cb" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.720858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2j6cb" event={"ID":"c5c452f6-0d03-4e67-bab0-0dcb1926f523","Type":"ContainerDied","Data":"9b965b6a64d75a3d5473b779741f86be30e440433515addd9c385dc9a5cbcb07"} Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.720998 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b965b6a64d75a3d5473b779741f86be30e440433515addd9c385dc9a5cbcb07" Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.722677 4885 generic.go:334] "Generic (PLEG): container finished" podID="78491fc8-8cb0-489e-98b5-a3f19812d082" containerID="eeba5ae1b91678d4ac3e809b5740ea2a82a2c0f7fd0de4fde8e69ac0969323b3" exitCode=0 Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.722844 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9380-account-create-update-dnrjv" event={"ID":"78491fc8-8cb0-489e-98b5-a3f19812d082","Type":"ContainerDied","Data":"eeba5ae1b91678d4ac3e809b5740ea2a82a2c0f7fd0de4fde8e69ac0969323b3"} Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.722913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9380-account-create-update-dnrjv" event={"ID":"78491fc8-8cb0-489e-98b5-a3f19812d082","Type":"ContainerStarted","Data":"63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932"} Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.724033 4885 generic.go:334] "Generic (PLEG): container finished" podID="82526f77-6157-4c1e-b14d-72e377c0971b" containerID="717a425951e4fb12b77d67dd1a63826a7c6d54ddbaa73f8c26ec23dffce1231b" exitCode=0 Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.724100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z548t" event={"ID":"82526f77-6157-4c1e-b14d-72e377c0971b","Type":"ContainerDied","Data":"717a425951e4fb12b77d67dd1a63826a7c6d54ddbaa73f8c26ec23dffce1231b"} Dec 05 20:23:23 crc kubenswrapper[4885]: I1205 20:23:23.724128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z548t" event={"ID":"82526f77-6157-4c1e-b14d-72e377c0971b","Type":"ContainerStarted","Data":"2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781"} Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.232961 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z548t" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.239195 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.399315 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts\") pod \"78491fc8-8cb0-489e-98b5-a3f19812d082\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.399465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts\") pod \"82526f77-6157-4c1e-b14d-72e377c0971b\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.399819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttlvw\" (UniqueName: \"kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw\") pod \"82526f77-6157-4c1e-b14d-72e377c0971b\" (UID: \"82526f77-6157-4c1e-b14d-72e377c0971b\") " Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.399870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlr5p\" (UniqueName: \"kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p\") pod \"78491fc8-8cb0-489e-98b5-a3f19812d082\" (UID: \"78491fc8-8cb0-489e-98b5-a3f19812d082\") " Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.400222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82526f77-6157-4c1e-b14d-72e377c0971b" (UID: "82526f77-6157-4c1e-b14d-72e377c0971b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.400222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78491fc8-8cb0-489e-98b5-a3f19812d082" (UID: "78491fc8-8cb0-489e-98b5-a3f19812d082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.400575 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78491fc8-8cb0-489e-98b5-a3f19812d082-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.400613 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82526f77-6157-4c1e-b14d-72e377c0971b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.405096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p" (OuterVolumeSpecName: "kube-api-access-wlr5p") pod "78491fc8-8cb0-489e-98b5-a3f19812d082" (UID: "78491fc8-8cb0-489e-98b5-a3f19812d082"). InnerVolumeSpecName "kube-api-access-wlr5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.405428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw" (OuterVolumeSpecName: "kube-api-access-ttlvw") pod "82526f77-6157-4c1e-b14d-72e377c0971b" (UID: "82526f77-6157-4c1e-b14d-72e377c0971b"). InnerVolumeSpecName "kube-api-access-ttlvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.502066 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttlvw\" (UniqueName: \"kubernetes.io/projected/82526f77-6157-4c1e-b14d-72e377c0971b-kube-api-access-ttlvw\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.502362 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlr5p\" (UniqueName: \"kubernetes.io/projected/78491fc8-8cb0-489e-98b5-a3f19812d082-kube-api-access-wlr5p\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.705072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.715122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b127df-3095-45b6-b347-f1906d6317fe-etc-swift\") pod \"swift-storage-0\" (UID: \"18b127df-3095-45b6-b347-f1906d6317fe\") " pod="openstack/swift-storage-0" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.745173 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z548t" event={"ID":"82526f77-6157-4c1e-b14d-72e377c0971b","Type":"ContainerDied","Data":"2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781"} Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.745230 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a943065cf829d0da49be29faca3db53f3d01c3c91526bbab8f656946a5af781" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.745227 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z548t" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.747227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9380-account-create-update-dnrjv" event={"ID":"78491fc8-8cb0-489e-98b5-a3f19812d082","Type":"ContainerDied","Data":"63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932"} Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.747277 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63944a33560b315ffd71f0b794202a37d24e3b879989e2a920ce077bcbb65932" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.747281 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9380-account-create-update-dnrjv" Dec 05 20:23:25 crc kubenswrapper[4885]: I1205 20:23:25.765323 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 20:23:26 crc kubenswrapper[4885]: I1205 20:23:26.358348 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 20:23:26 crc kubenswrapper[4885]: I1205 20:23:26.755527 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerID="92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe" exitCode=0 Dec 05 20:23:26 crc kubenswrapper[4885]: I1205 20:23:26.755593 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerDied","Data":"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe"} Dec 05 20:23:26 crc kubenswrapper[4885]: I1205 20:23:26.757681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"369303d31c81b66ab00361867eec49781c423fbac42a1b13cc8be71a866d46b9"} Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.209745 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dsgxp"] Dec 05 20:23:27 crc kubenswrapper[4885]: E1205 20:23:27.210286 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78491fc8-8cb0-489e-98b5-a3f19812d082" containerName="mariadb-account-create-update" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210304 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="78491fc8-8cb0-489e-98b5-a3f19812d082" containerName="mariadb-account-create-update" Dec 05 20:23:27 crc kubenswrapper[4885]: E1205 20:23:27.210313 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c452f6-0d03-4e67-bab0-0dcb1926f523" containerName="swift-ring-rebalance" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210321 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c452f6-0d03-4e67-bab0-0dcb1926f523" containerName="swift-ring-rebalance" Dec 05 20:23:27 crc kubenswrapper[4885]: E1205 20:23:27.210334 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82526f77-6157-4c1e-b14d-72e377c0971b" containerName="mariadb-database-create" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210341 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="82526f77-6157-4c1e-b14d-72e377c0971b" containerName="mariadb-database-create" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210490 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c452f6-0d03-4e67-bab0-0dcb1926f523" containerName="swift-ring-rebalance" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210510 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="78491fc8-8cb0-489e-98b5-a3f19812d082" containerName="mariadb-account-create-update" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.210522 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="82526f77-6157-4c1e-b14d-72e377c0971b" containerName="mariadb-database-create" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.211049 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.220158 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.220529 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lcfzg" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.222459 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dsgxp"] Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.330146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76pq\" (UniqueName: \"kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.330306 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.330530 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.330704 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.432685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76pq\" (UniqueName: \"kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.432737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.432763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.432803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data\") pod 
\"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.436629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.438291 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.441270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.462158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76pq\" (UniqueName: \"kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq\") pod \"glance-db-sync-dsgxp\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") " pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.531010 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsgxp" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.781757 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerStarted","Data":"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1"} Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.782788 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.798893 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"d54acd77c825e39f901c7a84f0fea427dc00a70e6b3379e2e971c9d421094d8b"} Dec 05 20:23:27 crc kubenswrapper[4885]: I1205 20:23:27.811960 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.895887129 podStartE2EDuration="56.811943536s" podCreationTimestamp="2025-12-05 20:22:31 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.686841986 +0000 UTC m=+1028.983657647" lastFinishedPulling="2025-12-05 20:22:51.602898373 +0000 UTC m=+1036.899714054" observedRunningTime="2025-12-05 20:23:27.80856318 +0000 UTC m=+1073.105378871" watchObservedRunningTime="2025-12-05 20:23:27.811943536 +0000 UTC m=+1073.108759197" Dec 05 20:23:28 crc kubenswrapper[4885]: W1205 20:23:28.109656 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf42085d_f7f5_4dd5_86d1_7019ba4d0888.slice/crio-58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f WatchSource:0}: Error finding container 
58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f: Status 404 returned error can't find the container with id 58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.110125 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dsgxp"] Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.565997 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ptwvl" podUID="0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99" containerName="ovn-controller" probeResult="failure" output=< Dec 05 20:23:28 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 20:23:28 crc kubenswrapper[4885]: > Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.624423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.629653 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hgth4" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.815278 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"0eb39ea987183f18009823b65e8ee039268daa02694cd1a31e3985edc308558b"} Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.815325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"12975acc14cf1a4430e40c46b04735023e2ecd377269044884a55449b820d712"} Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.815337 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"a3ff73a722d85923f9a780a408bf293e6942b45bafdeb8827dfd21187badf19d"} Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.817775 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsgxp" event={"ID":"af42085d-f7f5-4dd5-86d1-7019ba4d0888","Type":"ContainerStarted","Data":"58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f"} Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.846293 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ptwvl-config-5dqwz"] Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.847265 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.861624 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.867668 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ptwvl-config-5dqwz"] Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962094 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962155 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962191 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9tk\" (UniqueName: \"kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962259 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:28 crc kubenswrapper[4885]: I1205 20:23:28.962299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9tk\" (UniqueName: \"kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063834 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.063885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.064180 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.065647 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.085238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9tk\" (UniqueName: \"kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk\") pod \"ovn-controller-ptwvl-config-5dqwz\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.172250 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.633278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ptwvl-config-5dqwz"] Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.828030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ptwvl-config-5dqwz" event={"ID":"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8","Type":"ContainerStarted","Data":"56d288fa62f1eeb1da79efea4e154b52a8f4a2994e2ed19469b57d1b54b3d300"} Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.829510 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerID="cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016" exitCode=0 Dec 05 20:23:29 crc kubenswrapper[4885]: I1205 20:23:29.829560 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerDied","Data":"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.853932 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"0c2487c7b234404a8b40011ed9784f9c4213fb46e044d917ba6d25f825272ad5"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.854363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"1949d4542d3ba2c41a4dddfc089a3f8197231badb8440032b0d699ef45b97c08"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.854382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"0e12cb9af909b3de9ca6623cd65ef61b47d25486719318d65cf7363fdad89c07"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.854409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"1e3f68561ca56a959d5164f3fdf5eecf98b8dc55ee9143ab1139699ff8cf3c72"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.858036 4885 generic.go:334] "Generic (PLEG): container finished" podID="6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" containerID="f25a0fc00444ba0cebd20b21f60f8abe3a689a707ee249c082905f312d12a095" exitCode=0 Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.858152 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ptwvl-config-5dqwz" 
event={"ID":"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8","Type":"ContainerDied","Data":"f25a0fc00444ba0cebd20b21f60f8abe3a689a707ee249c082905f312d12a095"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.862200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerStarted","Data":"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c"} Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.864726 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:23:30 crc kubenswrapper[4885]: I1205 20:23:30.920812 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.378065931 podStartE2EDuration="58.920778071s" podCreationTimestamp="2025-12-05 20:22:32 +0000 UTC" firstStartedPulling="2025-12-05 20:22:43.661950969 +0000 UTC m=+1028.958766630" lastFinishedPulling="2025-12-05 20:22:51.204663099 +0000 UTC m=+1036.501478770" observedRunningTime="2025-12-05 20:23:30.913624406 +0000 UTC m=+1076.210440077" watchObservedRunningTime="2025-12-05 20:23:30.920778071 +0000 UTC m=+1076.217593732" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.288427 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.387971 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388070 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388105 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388141 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388233 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c9tk\" (UniqueName: \"kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk\") pod \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\" (UID: \"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8\") " 
Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.388848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run" (OuterVolumeSpecName: "var-run") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.389835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts" (OuterVolumeSpecName: "scripts") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.389904 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.389963 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.390439 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.394456 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk" (OuterVolumeSpecName: "kube-api-access-5c9tk") pod "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" (UID: "6eff2375-9e4b-49c3-8b13-dc7b8cec33c8"). InnerVolumeSpecName "kube-api-access-5c9tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490515 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490548 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c9tk\" (UniqueName: \"kubernetes.io/projected/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-kube-api-access-5c9tk\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490560 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490569 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490580 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.490589 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.892877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"6432a3b4b2a89d840a7cfa3979ff736abbba337b64d4a14226a42cad4138fce5"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.893265 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"1b8eedda04a5d7358602b5dc84db03ec40559a9500b6defb3609f39b92b39b9c"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.893278 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"a26bb2d213b88b63326bb1b6f74170ba0f4510060e869ae118249d6730d9297e"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.893288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"6644ee45e207331bae47ccf161efe42409d848002d5bae9bc177b5424ed0defa"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.893296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"da539b33b93c6b45cf3fc338854459612a755334038c0ae3e286eafec59f425c"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.896950 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ptwvl-config-5dqwz" event={"ID":"6eff2375-9e4b-49c3-8b13-dc7b8cec33c8","Type":"ContainerDied","Data":"56d288fa62f1eeb1da79efea4e154b52a8f4a2994e2ed19469b57d1b54b3d300"} Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.896974 4885 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d288fa62f1eeb1da79efea4e154b52a8f4a2994e2ed19469b57d1b54b3d300" Dec 05 20:23:32 crc kubenswrapper[4885]: I1205 20:23:32.897041 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ptwvl-config-5dqwz" Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.404429 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ptwvl-config-5dqwz"] Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.410531 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ptwvl-config-5dqwz"] Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.580923 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ptwvl" Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.913191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"af8a1ea29ff1f7002c30e4960720f62ce6ae1189703edf3eacc35ee94da55823"} Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.913237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18b127df-3095-45b6-b347-f1906d6317fe","Type":"ContainerStarted","Data":"87e92f275bed50ecc5e50fe2eacca5894fc7bcab68177b16206cecaabb780578"} Dec 05 20:23:33 crc kubenswrapper[4885]: I1205 20:23:33.953882 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.6539659 podStartE2EDuration="25.953863834s" podCreationTimestamp="2025-12-05 20:23:08 +0000 UTC" firstStartedPulling="2025-12-05 20:23:26.356897927 +0000 UTC m=+1071.653713598" lastFinishedPulling="2025-12-05 20:23:31.656795871 +0000 UTC m=+1076.953611532" observedRunningTime="2025-12-05 20:23:33.941825728 +0000 UTC m=+1079.238641399" watchObservedRunningTime="2025-12-05 20:23:33.953863834 +0000 UTC m=+1079.250679485" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.221233 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:23:34 crc kubenswrapper[4885]: E1205 20:23:34.221628 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" containerName="ovn-config" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.221648 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" containerName="ovn-config" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.221863 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" containerName="ovn-config" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.222861 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.226709 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.244717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333298 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333546 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjm5j\" (UniqueName: \"kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.333568 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465608 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjm5j\" (UniqueName: \"kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: 
\"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465729 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.465808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.468458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.468535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.468584 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.468557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.468676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: 
I1205 20:23:34.493208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjm5j\" (UniqueName: \"kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j\") pod \"dnsmasq-dns-7d5cc849d9-vffwb\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:34 crc kubenswrapper[4885]: I1205 20:23:34.539407 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:35 crc kubenswrapper[4885]: I1205 20:23:35.185305 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eff2375-9e4b-49c3-8b13-dc7b8cec33c8" path="/var/lib/kubelet/pods/6eff2375-9e4b-49c3-8b13-dc7b8cec33c8/volumes" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.592285 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.864838 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qg5dj"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.866124 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.873739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qg5dj"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.879184 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.963593 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9ph7x"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.964750 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.972939 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-486b-account-create-update-8dkrv"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.974413 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.975857 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.979797 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9ph7x"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.986248 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-486b-account-create-update-8dkrv"] Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.997366 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:43 crc kubenswrapper[4885]: I1205 20:23:43.997545 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmgq\" (UniqueName: \"kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.077516 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-63fc-account-create-update-zvpzs"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.078500 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.081245 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.091372 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63fc-account-create-update-zvpzs"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.098840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4hw\" (UniqueName: \"kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.098931 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6dp\" (UniqueName: \"kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.098971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.099036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.099079 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmgq\" (UniqueName: \"kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.099097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.099878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.117401 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmgq\" (UniqueName: \"kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq\") pod \"cinder-db-create-qg5dj\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.190692 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4hw\" (UniqueName: \"kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6dp\" (UniqueName: \"kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.200824 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpbt\" (UniqueName: \"kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.201394 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.201511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.209400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hdlzl"] Dec 05 20:23:44 crc kubenswrapper[4885]: 
I1205 20:23:44.210487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.212281 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.212815 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.212818 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c6lcf" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.214573 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.221836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6dp\" (UniqueName: \"kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp\") pod \"barbican-db-create-9ph7x\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.229086 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hdlzl"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.233147 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4hw\" (UniqueName: \"kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw\") pod \"cinder-486b-account-create-update-8dkrv\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.266746 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-b7z2f"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.268095 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.276941 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-b7z2f"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.280721 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.302511 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.305485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpbt\" (UniqueName: \"kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.305550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.305593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.305653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.305801 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5chwc\" (UniqueName: \"kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.307851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.326797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpbt\" (UniqueName: \"kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt\") pod \"barbican-63fc-account-create-update-zvpzs\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.359092 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f105-account-create-update-d94w4"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.360164 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.363317 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.367253 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f105-account-create-update-d94w4"] Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.396858 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.407567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.407609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.407666 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjntt\" (UniqueName: \"kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.407690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.407722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5chwc\" (UniqueName: \"kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.411660 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.413888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.424345 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5chwc\" (UniqueName: 
\"kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc\") pod \"keystone-db-sync-hdlzl\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.508880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.508974 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjntt\" (UniqueName: \"kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.509001 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9g2\" (UniqueName: \"kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.509031 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.509652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.525191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjntt\" (UniqueName: \"kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt\") pod \"neutron-db-create-b7z2f\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.606909 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.610680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.610765 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9g2\" (UniqueName: \"kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.611479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.616352 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.636942 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9g2\" (UniqueName: \"kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2\") pod \"neutron-f105-account-create-update-d94w4\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:44 crc kubenswrapper[4885]: I1205 20:23:44.682519 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:46 crc kubenswrapper[4885]: I1205 20:23:46.631638 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:23:46 crc kubenswrapper[4885]: I1205 20:23:46.632509 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:23:46 crc kubenswrapper[4885]: E1205 20:23:46.799520 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63" Dec 05 20:23:46 crc kubenswrapper[4885]: E1205 20:23:46.799678 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g76pq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-dsgxp_openstack(af42085d-f7f5-4dd5-86d1-7019ba4d0888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:23:46 crc 
kubenswrapper[4885]: E1205 20:23:46.811123 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-dsgxp" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" Dec 05 20:23:47 crc kubenswrapper[4885]: E1205 20:23:47.045614 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-dsgxp" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.227258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63fc-account-create-update-zvpzs"] Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.234517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.242076 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qg5dj"] Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.461910 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-486b-account-create-update-8dkrv"] Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.479140 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f105-account-create-update-d94w4"] Dec 05 20:23:47 crc kubenswrapper[4885]: W1205 20:23:47.505531 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a1b3cc_24e0_48ac_af60_1a740a0f6103.slice/crio-41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36 WatchSource:0}: Error finding container 41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36: Status 404 returned error can't find the container with id 41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36 Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.507186 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9ph7x"] Dec 05 20:23:47 crc kubenswrapper[4885]: W1205 20:23:47.514307 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod483b86cb_8402_4f2d_8423_7f88ff0cc353.slice/crio-475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c WatchSource:0}: Error finding container 475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c: Status 404 returned error can't find the container with id 475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.522544 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hdlzl"] Dec 05 20:23:47 crc kubenswrapper[4885]: I1205 20:23:47.528580 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-b7z2f"] Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.057010 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-486b-account-create-update-8dkrv" event={"ID":"fcfdb1c5-4c20-42eb-9a6e-e8716d226881","Type":"ContainerStarted","Data":"1e88419f74b85dfab715e45192413e6fceb421fed02931ab59ff4298c6cd7220"} Dec 05 20:23:48 crc kubenswrapper[4885]: 
I1205 20:23:48.057418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-486b-account-create-update-8dkrv" event={"ID":"fcfdb1c5-4c20-42eb-9a6e-e8716d226881","Type":"ContainerStarted","Data":"09262bded41116e8381ff79816b3a9c2c1b88e94bbb9d74a8b8ee421c389c493"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.059533 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerID="334a3762850280b82410ae5a389ee42f5c490420e8349e4d6ee4895f97252236" exitCode=0 Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.059627 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" event={"ID":"e4751187-c98f-4fb5-aba4-63b0f8715b69","Type":"ContainerDied","Data":"334a3762850280b82410ae5a389ee42f5c490420e8349e4d6ee4895f97252236"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.059671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" event={"ID":"e4751187-c98f-4fb5-aba4-63b0f8715b69","Type":"ContainerStarted","Data":"c4c2b616138287261b3177cadd03dd721baebf46c6f56f2dbc7756903cd37daf"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.063003 4885 generic.go:334] "Generic (PLEG): container finished" podID="077192b6-b7a8-4da8-b840-8486e927178f" containerID="5f395e573889f524a1a01c75a51e6089a32fa2e080d8e064d0d7d5d3cf76136c" exitCode=0 Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.063100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63fc-account-create-update-zvpzs" event={"ID":"077192b6-b7a8-4da8-b840-8486e927178f","Type":"ContainerDied","Data":"5f395e573889f524a1a01c75a51e6089a32fa2e080d8e064d0d7d5d3cf76136c"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.063128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63fc-account-create-update-zvpzs" event={"ID":"077192b6-b7a8-4da8-b840-8486e927178f","Type":"ContainerStarted","Data":"1b0d4a93551e1b3e5f016a13cbbb873ab5ea2d245e677d7ba762cb39b8d52c14"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.066196 4885 generic.go:334] "Generic (PLEG): container finished" podID="f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" containerID="ddc8cb6070ddda4869cedd5218287eeae217a7f0ed5a190af26bffc8c30275bb" exitCode=0 Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.066279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg5dj" event={"ID":"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd","Type":"ContainerDied","Data":"ddc8cb6070ddda4869cedd5218287eeae217a7f0ed5a190af26bffc8c30275bb"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.066313 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg5dj" event={"ID":"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd","Type":"ContainerStarted","Data":"046fd98b340fc316843c56531e3452589cbffcb06282dffa5beb6d7a62b5213f"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.069310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f105-account-create-update-d94w4" event={"ID":"34a1b3cc-24e0-48ac-af60-1a740a0f6103","Type":"ContainerStarted","Data":"c27607964e29b261a360669f935dc6fe0b288a7336052682463c372d4703ac84"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.069362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f105-account-create-update-d94w4" 
event={"ID":"34a1b3cc-24e0-48ac-af60-1a740a0f6103","Type":"ContainerStarted","Data":"41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.075186 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-486b-account-create-update-8dkrv" podStartSLOduration=5.075165764 podStartE2EDuration="5.075165764s" podCreationTimestamp="2025-12-05 20:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:48.069497626 +0000 UTC m=+1093.366313287" watchObservedRunningTime="2025-12-05 20:23:48.075165764 +0000 UTC m=+1093.371981455" Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.081359 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9ph7x" event={"ID":"3d946273-15f3-46e4-a64e-7fb5cbcce090","Type":"ContainerStarted","Data":"ac155aa072a92cc10d6f357a15f70afb1d63ad6062ef817174bc0d33e02e88d8"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.081417 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9ph7x" event={"ID":"3d946273-15f3-46e4-a64e-7fb5cbcce090","Type":"ContainerStarted","Data":"3459c30c2a7d4a4133cc1a726fa1397075250ea6331f6a66310a7db269406da4"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.085352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-b7z2f" event={"ID":"0d27b3a6-f5ea-4e96-b5f5-22db1454767c","Type":"ContainerStarted","Data":"d2fd1b45172063e39a70ee6b2dd01a27495fcbf2190a1a169e47906834c8ddc0"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.085398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-b7z2f" event={"ID":"0d27b3a6-f5ea-4e96-b5f5-22db1454767c","Type":"ContainerStarted","Data":"bb39bbbfe17b52886189520e9e2f427b613b03846bdcc6ac08b86c988b97e2dd"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.091686 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hdlzl" event={"ID":"483b86cb-8402-4f2d-8423-7f88ff0cc353","Type":"ContainerStarted","Data":"475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c"} Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.110584 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f105-account-create-update-d94w4" podStartSLOduration=4.110563111 podStartE2EDuration="4.110563111s" podCreationTimestamp="2025-12-05 20:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:48.10254729 +0000 UTC m=+1093.399362981" watchObservedRunningTime="2025-12-05 20:23:48.110563111 +0000 UTC m=+1093.407378782" Dec 05 20:23:48 crc kubenswrapper[4885]: I1205 20:23:48.167666 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-9ph7x" podStartSLOduration=5.167651146 podStartE2EDuration="5.167651146s" podCreationTimestamp="2025-12-05 20:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:48.158464609 +0000 UTC m=+1093.455280270" watchObservedRunningTime="2025-12-05 20:23:48.167651146 +0000 UTC m=+1093.464466807" Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.126161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" event={"ID":"e4751187-c98f-4fb5-aba4-63b0f8715b69","Type":"ContainerStarted","Data":"6b716fb752c626d30def76af4bab6711fd693e2a61883d08d65fc1e7b87a2df6"} Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.130185 4885 generic.go:334] "Generic (PLEG): container finished" podID="34a1b3cc-24e0-48ac-af60-1a740a0f6103" containerID="c27607964e29b261a360669f935dc6fe0b288a7336052682463c372d4703ac84" exitCode=0 Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.130273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f105-account-create-update-d94w4" event={"ID":"34a1b3cc-24e0-48ac-af60-1a740a0f6103","Type":"ContainerDied","Data":"c27607964e29b261a360669f935dc6fe0b288a7336052682463c372d4703ac84"} Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.132311 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d946273-15f3-46e4-a64e-7fb5cbcce090" containerID="ac155aa072a92cc10d6f357a15f70afb1d63ad6062ef817174bc0d33e02e88d8" exitCode=0 Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.132339 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9ph7x" event={"ID":"3d946273-15f3-46e4-a64e-7fb5cbcce090","Type":"ContainerDied","Data":"ac155aa072a92cc10d6f357a15f70afb1d63ad6062ef817174bc0d33e02e88d8"} Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.134521 4885 generic.go:334] "Generic (PLEG): container finished" podID="0d27b3a6-f5ea-4e96-b5f5-22db1454767c" containerID="d2fd1b45172063e39a70ee6b2dd01a27495fcbf2190a1a169e47906834c8ddc0" exitCode=0 Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.134744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-b7z2f" event={"ID":"0d27b3a6-f5ea-4e96-b5f5-22db1454767c","Type":"ContainerDied","Data":"d2fd1b45172063e39a70ee6b2dd01a27495fcbf2190a1a169e47906834c8ddc0"} Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.138155 4885 generic.go:334] "Generic (PLEG): container finished" podID="fcfdb1c5-4c20-42eb-9a6e-e8716d226881" containerID="1e88419f74b85dfab715e45192413e6fceb421fed02931ab59ff4298c6cd7220" exitCode=0 Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.138354 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-486b-account-create-update-8dkrv" event={"ID":"fcfdb1c5-4c20-42eb-9a6e-e8716d226881","Type":"ContainerDied","Data":"1e88419f74b85dfab715e45192413e6fceb421fed02931ab59ff4298c6cd7220"} Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.156627 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podStartSLOduration=15.156604098 podStartE2EDuration="15.156604098s" podCreationTimestamp="2025-12-05 20:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:49.145181871 +0000 UTC m=+1094.441997542" watchObservedRunningTime="2025-12-05 20:23:49.156604098 +0000 UTC m=+1094.453419779" Dec 05 20:23:49 crc kubenswrapper[4885]: I1205 20:23:49.541037 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.727889 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.735008 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.780847 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.797733 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.804256 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.821865 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.835262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts\") pod \"077192b6-b7a8-4da8-b840-8486e927178f\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.835503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts\") pod \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.835555 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v9g2\" (UniqueName: \"kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2\") pod \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\" (UID: \"34a1b3cc-24e0-48ac-af60-1a740a0f6103\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.835651 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpbt\" (UniqueName: \"kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt\") pod \"077192b6-b7a8-4da8-b840-8486e927178f\" (UID: \"077192b6-b7a8-4da8-b840-8486e927178f\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.836857 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34a1b3cc-24e0-48ac-af60-1a740a0f6103" (UID: "34a1b3cc-24e0-48ac-af60-1a740a0f6103"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.836973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "077192b6-b7a8-4da8-b840-8486e927178f" (UID: "077192b6-b7a8-4da8-b840-8486e927178f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.848969 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2" (OuterVolumeSpecName: "kube-api-access-7v9g2") pod "34a1b3cc-24e0-48ac-af60-1a740a0f6103" (UID: "34a1b3cc-24e0-48ac-af60-1a740a0f6103"). 
InnerVolumeSpecName "kube-api-access-7v9g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.859208 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt" (OuterVolumeSpecName: "kube-api-access-9tpbt") pod "077192b6-b7a8-4da8-b840-8486e927178f" (UID: "077192b6-b7a8-4da8-b840-8486e927178f"). InnerVolumeSpecName "kube-api-access-9tpbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn4hw\" (UniqueName: \"kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw\") pod \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts\") pod \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937704 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmgq\" (UniqueName: \"kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq\") pod \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\" (UID: \"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f6dp\" (UniqueName: \"kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp\") pod \"3d946273-15f3-46e4-a64e-7fb5cbcce090\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts\") pod \"3d946273-15f3-46e4-a64e-7fb5cbcce090\" (UID: \"3d946273-15f3-46e4-a64e-7fb5cbcce090\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.937924 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts\") pod \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\" (UID: \"fcfdb1c5-4c20-42eb-9a6e-e8716d226881\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.938049 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts\") pod \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.938173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjntt\" (UniqueName: \"kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt\") pod \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\" (UID: \"0d27b3a6-f5ea-4e96-b5f5-22db1454767c\") " Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.938604 4885 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcfdb1c5-4c20-42eb-9a6e-e8716d226881" (UID: "fcfdb1c5-4c20-42eb-9a6e-e8716d226881"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.938644 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d946273-15f3-46e4-a64e-7fb5cbcce090" (UID: "3d946273-15f3-46e4-a64e-7fb5cbcce090"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.938676 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d27b3a6-f5ea-4e96-b5f5-22db1454767c" (UID: "0d27b3a6-f5ea-4e96-b5f5-22db1454767c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939216 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpbt\" (UniqueName: \"kubernetes.io/projected/077192b6-b7a8-4da8-b840-8486e927178f-kube-api-access-9tpbt\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939237 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939246 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077192b6-b7a8-4da8-b840-8486e927178f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939255 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a1b3cc-24e0-48ac-af60-1a740a0f6103-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939264 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v9g2\" (UniqueName: \"kubernetes.io/projected/34a1b3cc-24e0-48ac-af60-1a740a0f6103-kube-api-access-7v9g2\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939272 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d946273-15f3-46e4-a64e-7fb5cbcce090-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939280 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.939483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" (UID: "f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.942682 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt" (OuterVolumeSpecName: "kube-api-access-gjntt") pod "0d27b3a6-f5ea-4e96-b5f5-22db1454767c" (UID: "0d27b3a6-f5ea-4e96-b5f5-22db1454767c"). InnerVolumeSpecName "kube-api-access-gjntt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.943373 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw" (OuterVolumeSpecName: "kube-api-access-mn4hw") pod "fcfdb1c5-4c20-42eb-9a6e-e8716d226881" (UID: "fcfdb1c5-4c20-42eb-9a6e-e8716d226881"). InnerVolumeSpecName "kube-api-access-mn4hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.943946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp" (OuterVolumeSpecName: "kube-api-access-7f6dp") pod "3d946273-15f3-46e4-a64e-7fb5cbcce090" (UID: "3d946273-15f3-46e4-a64e-7fb5cbcce090"). InnerVolumeSpecName "kube-api-access-7f6dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:51 crc kubenswrapper[4885]: I1205 20:23:51.944730 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq" (OuterVolumeSpecName: "kube-api-access-lqmgq") pod "f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" (UID: "f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd"). InnerVolumeSpecName "kube-api-access-lqmgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.040871 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.040924 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmgq\" (UniqueName: \"kubernetes.io/projected/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd-kube-api-access-lqmgq\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.040982 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f6dp\" (UniqueName: \"kubernetes.io/projected/3d946273-15f3-46e4-a64e-7fb5cbcce090-kube-api-access-7f6dp\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.041000 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjntt\" (UniqueName: \"kubernetes.io/projected/0d27b3a6-f5ea-4e96-b5f5-22db1454767c-kube-api-access-gjntt\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.041043 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn4hw\" (UniqueName: \"kubernetes.io/projected/fcfdb1c5-4c20-42eb-9a6e-e8716d226881-kube-api-access-mn4hw\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.166173 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63fc-account-create-update-zvpzs" event={"ID":"077192b6-b7a8-4da8-b840-8486e927178f","Type":"ContainerDied","Data":"1b0d4a93551e1b3e5f016a13cbbb873ab5ea2d245e677d7ba762cb39b8d52c14"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.166209 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0d4a93551e1b3e5f016a13cbbb873ab5ea2d245e677d7ba762cb39b8d52c14" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.166249 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63fc-account-create-update-zvpzs" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.168524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg5dj" event={"ID":"f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd","Type":"ContainerDied","Data":"046fd98b340fc316843c56531e3452589cbffcb06282dffa5beb6d7a62b5213f"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.168580 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qg5dj" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.168929 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046fd98b340fc316843c56531e3452589cbffcb06282dffa5beb6d7a62b5213f" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.170884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f105-account-create-update-d94w4" event={"ID":"34a1b3cc-24e0-48ac-af60-1a740a0f6103","Type":"ContainerDied","Data":"41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.170942 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41cc70b1e90fe14e44198540e761ced0bd90cef054cebf7a5d7939dc975bad36" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.171012 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f105-account-create-update-d94w4" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.179736 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9ph7x" event={"ID":"3d946273-15f3-46e4-a64e-7fb5cbcce090","Type":"ContainerDied","Data":"3459c30c2a7d4a4133cc1a726fa1397075250ea6331f6a66310a7db269406da4"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.179777 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3459c30c2a7d4a4133cc1a726fa1397075250ea6331f6a66310a7db269406da4" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.179831 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9ph7x" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.191090 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-b7z2f" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.191585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-b7z2f" event={"ID":"0d27b3a6-f5ea-4e96-b5f5-22db1454767c","Type":"ContainerDied","Data":"bb39bbbfe17b52886189520e9e2f427b613b03846bdcc6ac08b86c988b97e2dd"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.191647 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb39bbbfe17b52886189520e9e2f427b613b03846bdcc6ac08b86c988b97e2dd" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.194096 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hdlzl" event={"ID":"483b86cb-8402-4f2d-8423-7f88ff0cc353","Type":"ContainerStarted","Data":"09c866db04b1255facc89243c66caaf1aa66013cccd0c1fd2f09d93f1c4462c8"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.201309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-486b-account-create-update-8dkrv" event={"ID":"fcfdb1c5-4c20-42eb-9a6e-e8716d226881","Type":"ContainerDied","Data":"09262bded41116e8381ff79816b3a9c2c1b88e94bbb9d74a8b8ee421c389c493"} Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.201355 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09262bded41116e8381ff79816b3a9c2c1b88e94bbb9d74a8b8ee421c389c493" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.201404 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-486b-account-create-update-8dkrv" Dec 05 20:23:52 crc kubenswrapper[4885]: I1205 20:23:52.247646 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hdlzl" podStartSLOduration=4.193507216 podStartE2EDuration="8.247620365s" podCreationTimestamp="2025-12-05 20:23:44 +0000 UTC" firstStartedPulling="2025-12-05 20:23:47.529550819 +0000 UTC m=+1092.826366480" lastFinishedPulling="2025-12-05 20:23:51.583663958 +0000 UTC m=+1096.880479629" observedRunningTime="2025-12-05 20:23:52.238502479 +0000 UTC m=+1097.535318140" watchObservedRunningTime="2025-12-05 20:23:52.247620365 +0000 UTC m=+1097.544436066" Dec 05 20:23:54 crc kubenswrapper[4885]: I1205 20:23:54.541459 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:23:54 crc kubenswrapper[4885]: I1205 20:23:54.611870 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:54 crc kubenswrapper[4885]: I1205 20:23:54.612444 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="dnsmasq-dns" containerID="cri-o://d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5" gracePeriod=10 Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.035166 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.198070 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc\") pod \"92f8ad64-3f8e-462a-91ae-091750185877\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.198574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb\") pod \"92f8ad64-3f8e-462a-91ae-091750185877\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.198673 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config\") pod \"92f8ad64-3f8e-462a-91ae-091750185877\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.198724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dd6p\" (UniqueName: \"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p\") pod \"92f8ad64-3f8e-462a-91ae-091750185877\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.198761 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb\") pod \"92f8ad64-3f8e-462a-91ae-091750185877\" (UID: \"92f8ad64-3f8e-462a-91ae-091750185877\") " Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.203984 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p" (OuterVolumeSpecName: "kube-api-access-9dd6p") pod "92f8ad64-3f8e-462a-91ae-091750185877" (UID: "92f8ad64-3f8e-462a-91ae-091750185877"). InnerVolumeSpecName "kube-api-access-9dd6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.229631 4885 generic.go:334] "Generic (PLEG): container finished" podID="92f8ad64-3f8e-462a-91ae-091750185877" containerID="d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5" exitCode=0 Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.229717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" event={"ID":"92f8ad64-3f8e-462a-91ae-091750185877","Type":"ContainerDied","Data":"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5"} Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.229760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" event={"ID":"92f8ad64-3f8e-462a-91ae-091750185877","Type":"ContainerDied","Data":"4d93e6acfd361e65bc0cfe4c2d8d2400439dd166902e700a67d0c542afaf5b5e"} Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.229786 4885 scope.go:117] "RemoveContainer" containerID="d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.229942 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-xkgrp" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.248158 4885 generic.go:334] "Generic (PLEG): container finished" podID="483b86cb-8402-4f2d-8423-7f88ff0cc353" containerID="09c866db04b1255facc89243c66caaf1aa66013cccd0c1fd2f09d93f1c4462c8" exitCode=0 Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.248227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hdlzl" event={"ID":"483b86cb-8402-4f2d-8423-7f88ff0cc353","Type":"ContainerDied","Data":"09c866db04b1255facc89243c66caaf1aa66013cccd0c1fd2f09d93f1c4462c8"} Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.251257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92f8ad64-3f8e-462a-91ae-091750185877" (UID: "92f8ad64-3f8e-462a-91ae-091750185877"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.259610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92f8ad64-3f8e-462a-91ae-091750185877" (UID: "92f8ad64-3f8e-462a-91ae-091750185877"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.271041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config" (OuterVolumeSpecName: "config") pod "92f8ad64-3f8e-462a-91ae-091750185877" (UID: "92f8ad64-3f8e-462a-91ae-091750185877"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.278757 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92f8ad64-3f8e-462a-91ae-091750185877" (UID: "92f8ad64-3f8e-462a-91ae-091750185877"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.288828 4885 scope.go:117] "RemoveContainer" containerID="b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.300353 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.300388 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.300397 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dd6p\" (UniqueName: \"kubernetes.io/projected/92f8ad64-3f8e-462a-91ae-091750185877-kube-api-access-9dd6p\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.300406 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.300415 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f8ad64-3f8e-462a-91ae-091750185877-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.312409 4885 scope.go:117] "RemoveContainer" containerID="d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5" Dec 05 20:23:55 crc kubenswrapper[4885]: E1205 20:23:55.312902 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5\": container with ID starting with d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5 not found: ID does not exist" containerID="d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.312938 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5"} err="failed to get container status \"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5\": rpc error: code = NotFound desc = could not find container \"d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5\": container with ID starting with d3c0e86ab257c232b4a3bb1330d08496197fa91b31ae0ed4d06a8063472352f5 not found: ID does not exist" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.312977 4885 scope.go:117] "RemoveContainer" containerID="b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74" Dec 05 20:23:55 crc kubenswrapper[4885]: E1205 20:23:55.313351 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74\": container with ID starting with b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74 not found: ID does not exist" containerID="b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.313403 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74"} err="failed to get container status \"b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74\": rpc error: code = NotFound desc = could not find container \"b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74\": container with ID starting with b8889bab38c5da09490113c402d40377ed6f9d4d7ee561b2e00150e4e9cf6f74 not found: ID does not exist" Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.559459 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:55 crc kubenswrapper[4885]: I1205 20:23:55.566746 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-xkgrp"] Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.567931 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.722040 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data\") pod \"483b86cb-8402-4f2d-8423-7f88ff0cc353\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.722439 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5chwc\" (UniqueName: \"kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc\") pod \"483b86cb-8402-4f2d-8423-7f88ff0cc353\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.722612 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle\") pod \"483b86cb-8402-4f2d-8423-7f88ff0cc353\" (UID: \"483b86cb-8402-4f2d-8423-7f88ff0cc353\") " Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.728321 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc" (OuterVolumeSpecName: "kube-api-access-5chwc") pod "483b86cb-8402-4f2d-8423-7f88ff0cc353" (UID: "483b86cb-8402-4f2d-8423-7f88ff0cc353"). InnerVolumeSpecName "kube-api-access-5chwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.774455 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "483b86cb-8402-4f2d-8423-7f88ff0cc353" (UID: "483b86cb-8402-4f2d-8423-7f88ff0cc353"). InnerVolumeSpecName "combined-ca-bundle". 
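The scope.go RemoveContainer lines followed by the E1205 "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pairs above are a benign race: the kubelet retries removal of a container that CRI-O has already deleted, and the gRPC NotFound answer means there is nothing left to do. The usual way cleanup paths stay idempotent is to treat NotFound as success; a sketch of that pattern with a hypothetical removeContainer helper, assuming a runtime client that surfaces gRPC status errors (the rpc error: code = NotFound text above is exactly that shape):

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeContainer deletes id via remove, treating "already gone" as
    // success so retries and races with the runtime stay harmless.
    func removeContainer(id string, remove func(string) error) error {
    	if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
    		return fmt.Errorf("remove %s: %w", id, err)
    	}
    	return nil // success, or the container was already gone (NotFound)
    }

    func main() {
    	// Simulate a runtime that has already deleted the container.
    	gone := func(id string) error {
    		return status.Error(codes.NotFound, "could not find container "+id)
    	}
    	fmt.Println(removeContainer("d3c0e86ab257", gone)) // <nil>: NotFound swallowed
    }

That the subsequent SyncLoop REMOVE proceeds normally suggests these error-level lines are cosmetic rather than a cleanup failure.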
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.781480 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data" (OuterVolumeSpecName: "config-data") pod "483b86cb-8402-4f2d-8423-7f88ff0cc353" (UID: "483b86cb-8402-4f2d-8423-7f88ff0cc353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.825793 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.825835 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483b86cb-8402-4f2d-8423-7f88ff0cc353-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:56 crc kubenswrapper[4885]: I1205 20:23:56.825846 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5chwc\" (UniqueName: \"kubernetes.io/projected/483b86cb-8402-4f2d-8423-7f88ff0cc353-kube-api-access-5chwc\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.182776 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f8ad64-3f8e-462a-91ae-091750185877" path="/var/lib/kubelet/pods/92f8ad64-3f8e-462a-91ae-091750185877/volumes" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.284771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hdlzl" event={"ID":"483b86cb-8402-4f2d-8423-7f88ff0cc353","Type":"ContainerDied","Data":"475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c"} Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.284817 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475c8eecbc576b507c7bdade42eeb2e9b081391ad85ff1d41e3dfe1167a9bd2c" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.284880 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hdlzl" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574365 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574767 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d27b3a6-f5ea-4e96-b5f5-22db1454767c" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574783 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d27b3a6-f5ea-4e96-b5f5-22db1454767c" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574800 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483b86cb-8402-4f2d-8423-7f88ff0cc353" containerName="keystone-db-sync" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574808 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="483b86cb-8402-4f2d-8423-7f88ff0cc353" containerName="keystone-db-sync" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574833 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077192b6-b7a8-4da8-b840-8486e927178f" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574843 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="077192b6-b7a8-4da8-b840-8486e927178f" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574861 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574870 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574881 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcfdb1c5-4c20-42eb-9a6e-e8716d226881" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574889 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcfdb1c5-4c20-42eb-9a6e-e8716d226881" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="dnsmasq-dns" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574910 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="dnsmasq-dns" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574927 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d946273-15f3-46e4-a64e-7fb5cbcce090" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574934 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d946273-15f3-46e4-a64e-7fb5cbcce090" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574954 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="init" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574961 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="init" Dec 05 20:23:57 crc kubenswrapper[4885]: E1205 20:23:57.574972 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a1b3cc-24e0-48ac-af60-1a740a0f6103" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.574979 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a1b3cc-24e0-48ac-af60-1a740a0f6103" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575208 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d27b3a6-f5ea-4e96-b5f5-22db1454767c" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575229 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f8ad64-3f8e-462a-91ae-091750185877" containerName="dnsmasq-dns" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575247 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a1b3cc-24e0-48ac-af60-1a740a0f6103" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575257 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d946273-15f3-46e4-a64e-7fb5cbcce090" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575273 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcfdb1c5-4c20-42eb-9a6e-e8716d226881" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575286 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" containerName="mariadb-database-create" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575303 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="077192b6-b7a8-4da8-b840-8486e927178f" containerName="mariadb-account-create-update" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.575317 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="483b86cb-8402-4f2d-8423-7f88ff0cc353" containerName="keystone-db-sync" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.576396 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.601892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.609414 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n4jbn"] Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.610685 4885 util.go:30] "No sandbox for pod can be found. 
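The cpu_manager.go:410 / state_mem.go:107 and memory_manager.go:354 burst above runs as the first new pod is admitted after the job pods finished: both resource managers sweep their checkpointed per-container assignments and drop entries whose pod UIDs are no longer active (each E line flags a stale entry, the paired I line deletes it). Stripped of the checkpointing machinery, the sweep is a set difference over a persisted map; a toy sketch with hypothetical names, assuming a podUID -> containerName -> assignment layout for illustration:

    package main

    import "fmt"

    // removeStaleState drops per-container resource assignments whose pod
    // is no longer active, mirroring the E/I pairs in the log above.
    func removeStaleState(state map[string]map[string]string, active map[string]bool) {
    	for podUID, containers := range state {
    		if active[podUID] {
    			continue
    		}
    		for containerName := range containers {
    			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
    				podUID, containerName)
    			delete(containers, containerName)
    		}
    		delete(state, podUID) // deleting during range is safe in Go
    	}
    }

    func main() {
    	state := map[string]map[string]string{
    		"92f8ad64-3f8e-462a-91ae-091750185877": {"init": "0-3", "dnsmasq-dns": "0-3"},
    		"483b86cb-8402-4f2d-8423-7f88ff0cc353": {"keystone-db-sync": "0-3"},
    	}
    	removeStaleState(state, map[string]bool{}) // none of the old pods are active
    	fmt.Println(len(state))                    // 0
    }

The E severity is cosmetic here too: finding stale entries is expected whenever short-lived job pods (the mariadb-database-create and mariadb-account-create-update containers above) churn quickly.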
Need to start a new one" pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.612506 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.613246 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.613584 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.613619 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c6lcf" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.617364 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.648627 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n4jbn"] Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747666 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mcq\" (UniqueName: \"kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747807 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747827 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747854 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747904 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.747987 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.748045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvrx\" (UniqueName: \"kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.795785 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.797589 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.799529 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-79h2m" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.800224 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.802293 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.802397 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.832307 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.852351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853516 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853590 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvrx\" (UniqueName: \"kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mcq\" (UniqueName: \"kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn" Dec 
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853878 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853931 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.854016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.855043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.858863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.853466 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.859482 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.860082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.860624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.862390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.862453 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.864713 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.865808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.868035 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.868281 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.878659 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.880645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.898085 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6jq57"]
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.899130 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6jq57"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.904928 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.905279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.905925 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hbpgp"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.915694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mcq\" (UniqueName: \"kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq\") pod \"dnsmasq-dns-f44b464f-s4qbq\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " pod="openstack/dnsmasq-dns-f44b464f-s4qbq"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.918707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvrx\" (UniqueName: \"kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx\") pod \"keystone-bootstrap-n4jbn\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.918775 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.936865 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n4jbn"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.955865 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.955917 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99h45\" (UniqueName: \"kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.955948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.955985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4"
Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0"
pod="openstack/ceilometer-0" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956376 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956419 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tm99\" (UniqueName: \"kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956491 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.956573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:57 crc kubenswrapper[4885]: I1205 20:23:57.962577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6jq57"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.000354 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-w6258"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.001699 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.007149 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ffgjd" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.007380 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.008503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.050328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w6258"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.062846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.062921 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.062957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.062991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063041 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tm99\" (UniqueName: \"kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwlx\" (UniqueName: \"kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063119 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063208 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063295 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99h45\" (UniqueName: \"kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063413 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 
20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.063516 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.073816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.074433 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.077539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.079051 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.079602 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.081002 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.082388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.087853 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.088142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.091456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.091496 4885 
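Two variants of the sandbox message alternate through this section: util.go:30 ("No sandbox for pod can be found") for pods the kubelet is starting for the first time, and util.go:48 ("No ready sandbox for pod can be found") for pods whose previous sandbox died with their containers. Both feed the same decision in pod sync: reuse the newest sandbox if it is ready, otherwise create attempt n+1. A condensed sketch of that decision with hypothetical types (the real check considers more state than this, e.g. network configuration changes):

    package main

    import "fmt"

    type sandbox struct {
    	attempt int
    	ready   bool
    }

    // needsNewSandbox reports whether pod sync must create a new sandbox
    // and which attempt it would be, mirroring the two log variants above.
    func needsNewSandbox(sandboxes []sandbox) (bool, int) {
    	if len(sandboxes) == 0 {
    		// util.go:30: "No sandbox for pod can be found. Need to start a new one"
    		return true, 0
    	}
    	latest := sandboxes[0] // assume sorted newest-first
    	if !latest.ready {
    		// util.go:48: "No ready sandbox for pod can be found. Need to start a new one"
    		return true, latest.attempt + 1
    	}
    	return false, latest.attempt
    }

    func main() {
    	fmt.Println(needsNewSandbox(nil))                   // true 0  (brand-new pod)
    	fmt.Println(needsNewSandbox([]sandbox{{0, false}})) // true 1  (sandbox died)
    	fmt.Println(needsNewSandbox([]sandbox{{1, true}}))  // false 1 (reuse)
    }

For the short-lived db-create and account-create-update job pods above, the util.go:48 path is expected noise: their only container exits, the sandbox is reaped, and the next sync observes nothing ready to reuse.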
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.092298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.100788 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tm99\" (UniqueName: \"kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99\") pod \"ceilometer-0\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") " pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.121903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99h45\" (UniqueName: \"kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45\") pod \"horizon-7f696c5669-tdhw4\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.165979 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.168677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.168786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwlx\" (UniqueName: \"kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.168860 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.168893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.168930 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplrf\" (UniqueName: \"kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 
20:23:58.168982 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169072 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169224 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169261 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.169474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.172475 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.192732 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.196227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.196505 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.197054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwlx\" (UniqueName: \"kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.198869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data\") pod \"cinder-db-sync-6jq57\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") " pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.201741 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.233903 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.240999 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.241171 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.252943 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-v5n6g"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.254151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.266204 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.266393 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.266542 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sq8z4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.266661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v5n6g"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.273886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplrf\" (UniqueName: \"kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.273922 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.273941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.273963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdrx\" (UniqueName: \"kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274064 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274149 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274221 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.274927 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.288677 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.293809 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5szt6"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.294925 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.297197 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.297841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.300335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplrf\" (UniqueName: \"kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf\") pod \"placement-db-sync-w6258\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.302797 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.302928 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ksdtr" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.324603 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5szt6"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.362636 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377054 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdrx\" (UniqueName: \"kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377139 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t7nr\" (UniqueName: \"kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377193 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377257 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377289 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rv2\" (UniqueName: \"kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2\") pod \"barbican-db-sync-5szt6\" (UID: 
\"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377334 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377409 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377437 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdmq\" (UniqueName: \"kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377487 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.377528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.378783 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " 
pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.379122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.379424 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.379568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.381125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.397150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdrx\" (UniqueName: \"kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx\") pod \"dnsmasq-dns-7f7cc5f48f-j8zf9\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.417545 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.465126 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6jq57" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t7nr\" (UniqueName: \"kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479324 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479341 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479356 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rv2\" (UniqueName: \"kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479443 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdmq\" (UniqueName: \"kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479469 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479494 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.479521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.480573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.480859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.481885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.485232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.486142 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-w6258" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.488588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.488594 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.492507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.498803 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.499366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdmq\" (UniqueName: \"kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq\") pod \"neutron-db-sync-v5n6g\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.505604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rv2\" (UniqueName: \"kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2\") pod \"barbican-db-sync-5szt6\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.508293 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.509076 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t7nr\" (UniqueName: \"kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr\") pod \"horizon-55c9479c7c-4wh76\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.555712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.612594 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.622005 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5szt6" Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.668497 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n4jbn"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.725779 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:23:58 crc kubenswrapper[4885]: I1205 20:23:58.747814 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.062198 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:23:59 crc kubenswrapper[4885]: W1205 20:23:59.192894 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4a908e8_64e1_4fec_b455_66527f7efee3.slice/crio-62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214 WatchSource:0}: Error finding container 62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214: Status 404 returned error can't find the container with id 62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214 Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.197221 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6jq57"] Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.286208 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5szt6"] Dec 05 20:23:59 crc kubenswrapper[4885]: W1205 20:23:59.290677 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9be03938_1d91_45a5_beba_a54b318fc799.slice/crio-e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306 WatchSource:0}: Error finding container e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306: Status 404 returned error can't find the container with id e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306 Dec 05 20:23:59 crc kubenswrapper[4885]: W1205 20:23:59.295381 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88521675_6180_4a17_ba7d_6bb9eb07e7dd.slice/crio-c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5 WatchSource:0}: Error finding container c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5: Status 404 returned error can't find the container with id c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5 Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.306121 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w6258"] Dec 05 20:23:59 crc kubenswrapper[4885]: W1205 20:23:59.311513 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91e68318_2de7_47b6_b2fd_c5932959f0ce.slice/crio-c9325c3871d24c981c65870306c84307a66263d501cf0b128703cfdb722588a5 WatchSource:0}: Error finding container c9325c3871d24c981c65870306c84307a66263d501cf0b128703cfdb722588a5: Status 404 returned error can't find the container with id c9325c3871d24c981c65870306c84307a66263d501cf0b128703cfdb722588a5 Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.326072 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.348039 
4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"] Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.356267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w6258" event={"ID":"9be03938-1d91-45a5-beba-a54b318fc799","Type":"ContainerStarted","Data":"e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.366277 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n4jbn" event={"ID":"54ec1d5a-8c8e-434e-b45b-64e58339a6f7","Type":"ContainerStarted","Data":"49600fea487c33c17c34ac3298c5a4108a2f43b6342529c6eb84e6e4e331716b"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.372110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c9479c7c-4wh76" event={"ID":"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21","Type":"ContainerStarted","Data":"9c3fa782a02ea1993374fc085dbd9d1880f9e12fb1ee15eed0629aaa7d3e7d5d"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.372966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5szt6" event={"ID":"88521675-6180-4a17-ba7d-6bb9eb07e7dd","Type":"ContainerStarted","Data":"c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.373905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerStarted","Data":"ee8b99d58db7f40769260f0ce89044e6f0dc08f15156fbf6b64831c02b2c6be8"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.385110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6jq57" event={"ID":"e4a908e8-64e1-4fec-b455-66527f7efee3","Type":"ContainerStarted","Data":"62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.389095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" event={"ID":"fcb7c847-ed2a-4d55-850e-00696476910b","Type":"ContainerStarted","Data":"9574620a88e13eb058546ef92626d077bc47707770ec91bf66f9fb1ff356066a"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.413148 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f696c5669-tdhw4" event={"ID":"21b4bc88-9c7f-43e5-8731-69fc8942f594","Type":"ContainerStarted","Data":"210b03812f1f56e3c61b30994eedc56d6e834829d317ffe2f5e04ce345b591bc"} Dec 05 20:23:59 crc kubenswrapper[4885]: I1205 20:23:59.503298 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v5n6g"] Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.436672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n4jbn" event={"ID":"54ec1d5a-8c8e-434e-b45b-64e58339a6f7","Type":"ContainerStarted","Data":"d13a3eeadcd25a0137b9bb8825da963d72f583fe23066153e8266055f0b0ce9e"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.448158 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v5n6g" event={"ID":"7c0c93a6-1c5d-49b8-b56b-92460295ec1a","Type":"ContainerStarted","Data":"3d283843e63c03be7d9bef8cdada2311901b50fe69cbc53fa6e187d1a092694b"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.448206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v5n6g" 
event={"ID":"7c0c93a6-1c5d-49b8-b56b-92460295ec1a","Type":"ContainerStarted","Data":"9f94472bbe626b2a31e67cd5e7a0ecbe51d563dadda1eda33a95cc281cf1629a"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.457920 4885 generic.go:334] "Generic (PLEG): container finished" podID="fcb7c847-ed2a-4d55-850e-00696476910b" containerID="02991b505640e173d91121c81d90a15f84928f7b2ca50ac8257b99eee7c005a6" exitCode=0 Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.458038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" event={"ID":"fcb7c847-ed2a-4d55-850e-00696476910b","Type":"ContainerDied","Data":"02991b505640e173d91121c81d90a15f84928f7b2ca50ac8257b99eee7c005a6"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.466378 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n4jbn" podStartSLOduration=3.46636209 podStartE2EDuration="3.46636209s" podCreationTimestamp="2025-12-05 20:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:00.46570733 +0000 UTC m=+1105.762522991" watchObservedRunningTime="2025-12-05 20:24:00.46636209 +0000 UTC m=+1105.763177751" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.468940 4885 generic.go:334] "Generic (PLEG): container finished" podID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerID="fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22" exitCode=0 Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.469047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" event={"ID":"91e68318-2de7-47b6-b2fd-c5932959f0ce","Type":"ContainerDied","Data":"fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.469079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" event={"ID":"91e68318-2de7-47b6-b2fd-c5932959f0ce","Type":"ContainerStarted","Data":"c9325c3871d24c981c65870306c84307a66263d501cf0b128703cfdb722588a5"} Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.583261 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v5n6g" podStartSLOduration=2.583241893 podStartE2EDuration="2.583241893s" podCreationTimestamp="2025-12-05 20:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:00.551854478 +0000 UTC m=+1105.848670139" watchObservedRunningTime="2025-12-05 20:24:00.583241893 +0000 UTC m=+1105.880057554" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.638056 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.682286 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.701923 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.706224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.736611 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.849169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfd8x\" (UniqueName: \"kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.851308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.851351 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.851385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.851436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.952751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.952819 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.952863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.952980 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfd8x\" (UniqueName: \"kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.952998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.953511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.953818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.955151 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.964297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:00 crc kubenswrapper[4885]: I1205 20:24:00.974456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfd8x\" (UniqueName: \"kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x\") pod \"horizon-59cc747f79-h5ns4\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.048193 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.850787 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.982747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.982825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.982876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.982907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.982957 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68mcq\" (UniqueName: \"kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:01 crc kubenswrapper[4885]: I1205 20:24:01.983053 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb\") pod \"fcb7c847-ed2a-4d55-850e-00696476910b\" (UID: \"fcb7c847-ed2a-4d55-850e-00696476910b\") " Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.003766 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq" (OuterVolumeSpecName: "kube-api-access-68mcq") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "kube-api-access-68mcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.014311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.040723 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config" (OuterVolumeSpecName: "config") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.079205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.084834 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.084873 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.084882 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68mcq\" (UniqueName: \"kubernetes.io/projected/fcb7c847-ed2a-4d55-850e-00696476910b-kube-api-access-68mcq\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.084894 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.088934 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.091590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fcb7c847-ed2a-4d55-850e-00696476910b" (UID: "fcb7c847-ed2a-4d55-850e-00696476910b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.186135 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.186168 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb7c847-ed2a-4d55-850e-00696476910b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.499865 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" event={"ID":"fcb7c847-ed2a-4d55-850e-00696476910b","Type":"ContainerDied","Data":"9574620a88e13eb058546ef92626d077bc47707770ec91bf66f9fb1ff356066a"} Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.499944 4885 scope.go:117] "RemoveContainer" containerID="02991b505640e173d91121c81d90a15f84928f7b2ca50ac8257b99eee7c005a6" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.500137 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f44b464f-s4qbq" Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.614175 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.619275 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f44b464f-s4qbq"] Dec 05 20:24:02 crc kubenswrapper[4885]: I1205 20:24:02.689824 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:03 crc kubenswrapper[4885]: I1205 20:24:03.190119 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb7c847-ed2a-4d55-850e-00696476910b" path="/var/lib/kubelet/pods/fcb7c847-ed2a-4d55-850e-00696476910b/volumes" Dec 05 20:24:03 crc kubenswrapper[4885]: I1205 20:24:03.514859 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" event={"ID":"91e68318-2de7-47b6-b2fd-c5932959f0ce","Type":"ContainerStarted","Data":"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"} Dec 05 20:24:03 crc kubenswrapper[4885]: I1205 20:24:03.515952 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:24:03 crc kubenswrapper[4885]: I1205 20:24:03.520012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59cc747f79-h5ns4" event={"ID":"775ea4fb-9967-4ddb-bfc2-874afb08f0c1","Type":"ContainerStarted","Data":"3649817de0f7e609437dc321d32e479869bbb15a8df12d2275104d0f01897de2"} Dec 05 20:24:03 crc kubenswrapper[4885]: I1205 20:24:03.551373 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" podStartSLOduration=6.551346526 podStartE2EDuration="6.551346526s" podCreationTimestamp="2025-12-05 20:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:03.538487622 +0000 UTC m=+1108.835303293" watchObservedRunningTime="2025-12-05 20:24:03.551346526 +0000 UTC m=+1108.848162187" Dec 05 20:24:05 crc kubenswrapper[4885]: I1205 20:24:05.568916 4885 generic.go:334] "Generic (PLEG): container finished" podID="54ec1d5a-8c8e-434e-b45b-64e58339a6f7" 
containerID="d13a3eeadcd25a0137b9bb8825da963d72f583fe23066153e8266055f0b0ce9e" exitCode=0 Dec 05 20:24:05 crc kubenswrapper[4885]: I1205 20:24:05.570284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n4jbn" event={"ID":"54ec1d5a-8c8e-434e-b45b-64e58339a6f7","Type":"ContainerDied","Data":"d13a3eeadcd25a0137b9bb8825da963d72f583fe23066153e8266055f0b0ce9e"} Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.510860 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.550780 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"] Dec 05 20:24:06 crc kubenswrapper[4885]: E1205 20:24:06.551381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb7c847-ed2a-4d55-850e-00696476910b" containerName="init" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.551404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb7c847-ed2a-4d55-850e-00696476910b" containerName="init" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.551602 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb7c847-ed2a-4d55-850e-00696476910b" containerName="init" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.552646 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554827 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554898 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554926 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fcc2\" (UniqueName: \"kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.554985 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.555038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.555047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.576405 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"] Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.654079 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657836 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcc2\" (UniqueName: \"kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657909 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.657947 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.664368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.665459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.665664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.665849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.667243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.672352 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d9999949d-c22ch"] Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.674369 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.683597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.706939 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d9999949d-c22ch"] Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.715647 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fcc2\" (UniqueName: \"kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2\") pod \"horizon-7ddb869454-vvfd9\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.870502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcgpt\" (UniqueName: \"kubernetes.io/projected/d0f84b71-1907-4f71-833d-1e5561a4f0f8-kube-api-access-vcgpt\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871126 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-combined-ca-bundle\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-tls-certs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-config-data\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871251 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-scripts\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f84b71-1907-4f71-833d-1e5561a4f0f8-logs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.871460 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-secret-key\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.873216 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f84b71-1907-4f71-833d-1e5561a4f0f8-logs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-secret-key\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcgpt\" (UniqueName: \"kubernetes.io/projected/d0f84b71-1907-4f71-833d-1e5561a4f0f8-kube-api-access-vcgpt\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-combined-ca-bundle\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973369 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-tls-certs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-config-data\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973429 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-scripts\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.973733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f84b71-1907-4f71-833d-1e5561a4f0f8-logs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.974239 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-scripts\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.975071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f84b71-1907-4f71-833d-1e5561a4f0f8-config-data\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.977978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-combined-ca-bundle\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.978502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-secret-key\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.978836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f84b71-1907-4f71-833d-1e5561a4f0f8-horizon-tls-certs\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:06 crc kubenswrapper[4885]: I1205 20:24:06.999590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcgpt\" (UniqueName: \"kubernetes.io/projected/d0f84b71-1907-4f71-833d-1e5561a4f0f8-kube-api-access-vcgpt\") pod \"horizon-7d9999949d-c22ch\" (UID: \"d0f84b71-1907-4f71-833d-1e5561a4f0f8\") " pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:07 crc kubenswrapper[4885]: I1205 20:24:07.090643 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:24:08 crc kubenswrapper[4885]: I1205 20:24:08.510428 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:24:08 crc kubenswrapper[4885]: I1205 20:24:08.627757 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:24:08 crc kubenswrapper[4885]: I1205 20:24:08.629135 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" containerID="cri-o://6b716fb752c626d30def76af4bab6711fd693e2a61883d08d65fc1e7b87a2df6" gracePeriod=10 Dec 05 20:24:09 crc kubenswrapper[4885]: I1205 20:24:09.540864 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Dec 05 20:24:09 crc kubenswrapper[4885]: I1205 20:24:09.632027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" event={"ID":"e4751187-c98f-4fb5-aba4-63b0f8715b69","Type":"ContainerDied","Data":"6b716fb752c626d30def76af4bab6711fd693e2a61883d08d65fc1e7b87a2df6"} Dec 05 20:24:09 crc kubenswrapper[4885]: I1205 20:24:09.633769 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerID="6b716fb752c626d30def76af4bab6711fd693e2a61883d08d65fc1e7b87a2df6" exitCode=0 Dec 05 20:24:14 crc kubenswrapper[4885]: I1205 20:24:14.542785 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.404179 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.404569 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66bh564h9dh54fh699hbfh5d4h574h656h549h5d6h59ch5b9hb5h55bh9hd9h589h5fbh66chfdh59hcch5h9chddhbch599h5bfh67bh5c9h5fbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2t7nr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55c9479c7c-4wh76_openstack(cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.406602 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f\\\"\"]" pod="openstack/horizon-55c9479c7c-4wh76" podUID="cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.406769 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.406842 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n547hdh57dh5d4h97h5b6h56h556h699h66ch66h555h6ch5cdh67ch5fchc7hc4hd5h579h659hc6h57fh5dch575hd4h669h86h58ch5b7h5dbh657q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99h45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f696c5669-tdhw4_openstack(21b4bc88-9c7f-43e5-8731-69fc8942f594): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:15 crc kubenswrapper[4885]: E1205 20:24:15.410375 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f\\\"\"]" pod="openstack/horizon-7f696c5669-tdhw4" podUID="21b4bc88-9c7f-43e5-8731-69fc8942f594" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.468343 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533058 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533445 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.533519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvrx\" (UniqueName: \"kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx\") pod \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\" (UID: \"54ec1d5a-8c8e-434e-b45b-64e58339a6f7\") " Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.540363 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx" (OuterVolumeSpecName: "kube-api-access-cwvrx") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "kube-api-access-cwvrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.540657 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts" (OuterVolumeSpecName: "scripts") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.544523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.547088 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.570249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.587370 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data" (OuterVolumeSpecName: "config-data") pod "54ec1d5a-8c8e-434e-b45b-64e58339a6f7" (UID: "54ec1d5a-8c8e-434e-b45b-64e58339a6f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.635880 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.636012 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.636099 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.636180 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvrx\" (UniqueName: \"kubernetes.io/projected/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-kube-api-access-cwvrx\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.636339 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.636407 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ec1d5a-8c8e-434e-b45b-64e58339a6f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.697004 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n4jbn" Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.697290 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n4jbn" event={"ID":"54ec1d5a-8c8e-434e-b45b-64e58339a6f7","Type":"ContainerDied","Data":"49600fea487c33c17c34ac3298c5a4108a2f43b6342529c6eb84e6e4e331716b"} Dec 05 20:24:15 crc kubenswrapper[4885]: I1205 20:24:15.700383 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49600fea487c33c17c34ac3298c5a4108a2f43b6342529c6eb84e6e4e331716b" Dec 05 20:24:16 crc kubenswrapper[4885]: E1205 20:24:16.024510 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31" Dec 05 20:24:16 crc kubenswrapper[4885]: E1205 20:24:16.024685 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h586h5fbh574h55fh99h5d9h677h57bhdch689h66bh9h67dh5ddh86h544h579h677h647h658h659h596h5b6h5cch5dh5fh565h5fch5b5hcdhd6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6tm99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a91533ae-4113-4680-8fb9-c0a3fa74daa8): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.626802 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n4jbn"] Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.633289 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.633346 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.636160 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n4jbn"] Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.653350 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cct8h"] Dec 05 20:24:16 crc kubenswrapper[4885]: E1205 20:24:16.653706 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ec1d5a-8c8e-434e-b45b-64e58339a6f7" containerName="keystone-bootstrap" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.653724 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ec1d5a-8c8e-434e-b45b-64e58339a6f7" containerName="keystone-bootstrap" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.653911 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ec1d5a-8c8e-434e-b45b-64e58339a6f7" containerName="keystone-bootstrap" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.655758 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.658316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.658552 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.658683 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.658903 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c6lcf" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.659105 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.667730 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cct8h"] Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.757824 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.757906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.757933 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zvj\" (UniqueName: \"kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.757998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.758040 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.758061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.860596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.860735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.860835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.860989 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.861085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.861110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zvj\" (UniqueName: \"kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.866610 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.866911 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.867397 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.867472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " 
pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.868217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.878753 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zvj\" (UniqueName: \"kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj\") pod \"keystone-bootstrap-cct8h\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") " pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:16 crc kubenswrapper[4885]: I1205 20:24:16.978425 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cct8h" Dec 05 20:24:17 crc kubenswrapper[4885]: I1205 20:24:17.193604 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ec1d5a-8c8e-434e-b45b-64e58339a6f7" path="/var/lib/kubelet/pods/54ec1d5a-8c8e-434e-b45b-64e58339a6f7/volumes" Dec 05 20:24:17 crc kubenswrapper[4885]: E1205 20:24:17.709924 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:f24234939afca841e46ea4d17bec959b63705ab0e75476465e777d44905c5f1b" Dec 05 20:24:17 crc kubenswrapper[4885]: E1205 20:24:17.710379 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:f24234939afca841e46ea4d17bec959b63705ab0e75476465e777d44905c5f1b,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lplrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-w6258_openstack(9be03938-1d91-45a5-beba-a54b318fc799): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:17 crc kubenswrapper[4885]: E1205 20:24:17.711772 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-w6258" podUID="9be03938-1d91-45a5-beba-a54b318fc799" Dec 05 20:24:18 crc kubenswrapper[4885]: E1205 20:24:18.727486 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:f24234939afca841e46ea4d17bec959b63705ab0e75476465e777d44905c5f1b\\\"\"" pod="openstack/placement-db-sync-w6258" podUID="9be03938-1d91-45a5-beba-a54b318fc799" Dec 05 20:24:21 crc kubenswrapper[4885]: I1205 20:24:21.753470 4885 generic.go:334] "Generic (PLEG): container finished" podID="7c0c93a6-1c5d-49b8-b56b-92460295ec1a" containerID="3d283843e63c03be7d9bef8cdada2311901b50fe69cbc53fa6e187d1a092694b" exitCode=0 Dec 05 20:24:21 crc kubenswrapper[4885]: I1205 20:24:21.753567 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v5n6g" event={"ID":"7c0c93a6-1c5d-49b8-b56b-92460295ec1a","Type":"ContainerDied","Data":"3d283843e63c03be7d9bef8cdada2311901b50fe69cbc53fa6e187d1a092694b"} Dec 05 20:24:24 crc kubenswrapper[4885]: 
I1205 20:24:24.540996 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 20:24:24 crc kubenswrapper[4885]: I1205 20:24:24.541814 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:24:29 crc kubenswrapper[4885]: I1205 20:24:29.541513 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.393258 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.393874 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92rv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-5szt6_openstack(88521675-6180-4a17-ba7d-6bb9eb07e7dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.395401 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-5szt6" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.397372 4885 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.397520 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64fh644hdh586hc6h54fh5cfh645h57dhb6h599hbchf9h589h587h67hffh566h68chb5h5bdh59ch666h66fh5bh674h689h685h5cdh665h688h59bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfd8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59cc747f79-h5ns4_openstack(775ea4fb-9967-4ddb-bfc2-874afb08f0c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.401495 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f\\\"\"]" pod="openstack/horizon-59cc747f79-h5ns4" podUID="775ea4fb-9967-4ddb-bfc2-874afb08f0c1" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.493747 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504377 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs\") pod \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts\") pod \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504457 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data\") pod \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504505 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key\") pod \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t7nr\" (UniqueName: \"kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr\") pod \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\" (UID: \"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.504642 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.505544 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts" (OuterVolumeSpecName: "scripts") pod "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" (UID: "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.505552 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs" (OuterVolumeSpecName: "logs") pod "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" (UID: "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.505747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data" (OuterVolumeSpecName: "config-data") pod "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" (UID: "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.514622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr" (OuterVolumeSpecName: "kube-api-access-2t7nr") pod "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" (UID: "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21"). InnerVolumeSpecName "kube-api-access-2t7nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.514716 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" (UID: "cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.572340 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.579404 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.605609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data\") pod \"21b4bc88-9c7f-43e5-8731-69fc8942f594\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.605773 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606203 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data" (OuterVolumeSpecName: "config-data") pod "21b4bc88-9c7f-43e5-8731-69fc8942f594" (UID: "21b4bc88-9c7f-43e5-8731-69fc8942f594"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdmq\" (UniqueName: \"kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq\") pod \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606321 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606342 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99h45\" (UniqueName: \"kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45\") pod \"21b4bc88-9c7f-43e5-8731-69fc8942f594\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606412 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key\") pod \"21b4bc88-9c7f-43e5-8731-69fc8942f594\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts\") pod \"21b4bc88-9c7f-43e5-8731-69fc8942f594\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs\") pod \"21b4bc88-9c7f-43e5-8731-69fc8942f594\" (UID: \"21b4bc88-9c7f-43e5-8731-69fc8942f594\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606567 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config\") pod \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\" (UID: 
\"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606623 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle\") pod \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\" (UID: \"7c0c93a6-1c5d-49b8-b56b-92460295ec1a\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.606647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjm5j\" (UniqueName: \"kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j\") pod \"e4751187-c98f-4fb5-aba4-63b0f8715b69\" (UID: \"e4751187-c98f-4fb5-aba4-63b0f8715b69\") " Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607147 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607160 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607168 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607176 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607186 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607194 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t7nr\" (UniqueName: \"kubernetes.io/projected/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21-kube-api-access-2t7nr\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.607876 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs" (OuterVolumeSpecName: "logs") pod "21b4bc88-9c7f-43e5-8731-69fc8942f594" (UID: "21b4bc88-9c7f-43e5-8731-69fc8942f594"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.633110 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts" (OuterVolumeSpecName: "scripts") pod "21b4bc88-9c7f-43e5-8731-69fc8942f594" (UID: "21b4bc88-9c7f-43e5-8731-69fc8942f594"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.637187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j" (OuterVolumeSpecName: "kube-api-access-bjm5j") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). 
InnerVolumeSpecName "kube-api-access-bjm5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.637876 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "21b4bc88-9c7f-43e5-8731-69fc8942f594" (UID: "21b4bc88-9c7f-43e5-8731-69fc8942f594"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.644171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45" (OuterVolumeSpecName: "kube-api-access-99h45") pod "21b4bc88-9c7f-43e5-8731-69fc8942f594" (UID: "21b4bc88-9c7f-43e5-8731-69fc8942f594"). InnerVolumeSpecName "kube-api-access-99h45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.644314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq" (OuterVolumeSpecName: "kube-api-access-vgdmq") pod "7c0c93a6-1c5d-49b8-b56b-92460295ec1a" (UID: "7c0c93a6-1c5d-49b8-b56b-92460295ec1a"). InnerVolumeSpecName "kube-api-access-vgdmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.659709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.663981 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c0c93a6-1c5d-49b8-b56b-92460295ec1a" (UID: "7c0c93a6-1c5d-49b8-b56b-92460295ec1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.668349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.673924 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.683455 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config" (OuterVolumeSpecName: "config") pod "7c0c93a6-1c5d-49b8-b56b-92460295ec1a" (UID: "7c0c93a6-1c5d-49b8-b56b-92460295ec1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.692236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.694043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config" (OuterVolumeSpecName: "config") pod "e4751187-c98f-4fb5-aba4-63b0f8715b69" (UID: "e4751187-c98f-4fb5-aba4-63b0f8715b69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708486 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708524 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708535 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjm5j\" (UniqueName: \"kubernetes.io/projected/e4751187-c98f-4fb5-aba4-63b0f8715b69-kube-api-access-bjm5j\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708545 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708556 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdmq\" (UniqueName: \"kubernetes.io/projected/7c0c93a6-1c5d-49b8-b56b-92460295ec1a-kube-api-access-vgdmq\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708564 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708572 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99h45\" (UniqueName: \"kubernetes.io/projected/21b4bc88-9c7f-43e5-8731-69fc8942f594-kube-api-access-99h45\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708580 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 
20:24:30.708588 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21b4bc88-9c7f-43e5-8731-69fc8942f594-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708596 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708604 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b4bc88-9c7f-43e5-8731-69fc8942f594-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708611 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21b4bc88-9c7f-43e5-8731-69fc8942f594-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.708619 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4751187-c98f-4fb5-aba4-63b0f8715b69-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.852115 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c9479c7c-4wh76" event={"ID":"cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21","Type":"ContainerDied","Data":"9c3fa782a02ea1993374fc085dbd9d1880f9e12fb1ee15eed0629aaa7d3e7d5d"} Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.852139 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c9479c7c-4wh76" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.853951 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v5n6g" event={"ID":"7c0c93a6-1c5d-49b8-b56b-92460295ec1a","Type":"ContainerDied","Data":"9f94472bbe626b2a31e67cd5e7a0ecbe51d563dadda1eda33a95cc281cf1629a"} Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.853997 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f94472bbe626b2a31e67cd5e7a0ecbe51d563dadda1eda33a95cc281cf1629a" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.854252 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v5n6g" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.855687 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f696c5669-tdhw4" event={"ID":"21b4bc88-9c7f-43e5-8731-69fc8942f594","Type":"ContainerDied","Data":"210b03812f1f56e3c61b30994eedc56d6e834829d317ffe2f5e04ce345b591bc"} Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.855748 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f696c5669-tdhw4" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.858942 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" event={"ID":"e4751187-c98f-4fb5-aba4-63b0f8715b69","Type":"ContainerDied","Data":"c4c2b616138287261b3177cadd03dd721baebf46c6f56f2dbc7756903cd37daf"} Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.859008 4885 scope.go:117] "RemoveContainer" containerID="6b716fb752c626d30def76af4bab6711fd693e2a61883d08d65fc1e7b87a2df6" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.859159 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" Dec 05 20:24:30 crc kubenswrapper[4885]: E1205 20:24:30.861659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f\\\"\"" pod="openstack/barbican-db-sync-5szt6" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.979728 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.989763 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55c9479c7c-4wh76"] Dec 05 20:24:30 crc kubenswrapper[4885]: I1205 20:24:30.998050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.004779 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5cc849d9-vffwb"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.021084 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.031268 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f696c5669-tdhw4"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.187088 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b4bc88-9c7f-43e5-8731-69fc8942f594" path="/var/lib/kubelet/pods/21b4bc88-9c7f-43e5-8731-69fc8942f594/volumes" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.189426 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21" path="/var/lib/kubelet/pods/cb5a2a3e-14cc-4b4c-be3b-f645a41ddf21/volumes" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.190235 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" path="/var/lib/kubelet/pods/e4751187-c98f-4fb5-aba4-63b0f8715b69/volumes" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.713654 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.714319 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzwlx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6jq57_openstack(e4a908e8-64e1-4fec-b455-66527f7efee3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.716529 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6jq57" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.768693 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.836901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfd8x\" (UniqueName: \"kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x\") pod \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.836961 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data\") pod \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.837044 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts\") pod \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.837064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key\") pod \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.837128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs\") pod \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\" (UID: \"775ea4fb-9967-4ddb-bfc2-874afb08f0c1\") " Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.837821 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs" (OuterVolumeSpecName: "logs") pod "775ea4fb-9967-4ddb-bfc2-874afb08f0c1" (UID: "775ea4fb-9967-4ddb-bfc2-874afb08f0c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.838122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data" (OuterVolumeSpecName: "config-data") pod "775ea4fb-9967-4ddb-bfc2-874afb08f0c1" (UID: "775ea4fb-9967-4ddb-bfc2-874afb08f0c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.838347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts" (OuterVolumeSpecName: "scripts") pod "775ea4fb-9967-4ddb-bfc2-874afb08f0c1" (UID: "775ea4fb-9967-4ddb-bfc2-874afb08f0c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.861267 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "775ea4fb-9967-4ddb-bfc2-874afb08f0c1" (UID: "775ea4fb-9967-4ddb-bfc2-874afb08f0c1"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.867124 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x" (OuterVolumeSpecName: "kube-api-access-mfd8x") pod "775ea4fb-9967-4ddb-bfc2-874afb08f0c1" (UID: "775ea4fb-9967-4ddb-bfc2-874afb08f0c1"). InnerVolumeSpecName "kube-api-access-mfd8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.884507 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"] Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.884876 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0c93a6-1c5d-49b8-b56b-92460295ec1a" containerName="neutron-db-sync" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.884888 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0c93a6-1c5d-49b8-b56b-92460295ec1a" containerName="neutron-db-sync" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.884907 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="init" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.884912 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="init" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.884926 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.884932 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.885088 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.885105 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0c93a6-1c5d-49b8-b56b-92460295ec1a" containerName="neutron-db-sync" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.886012 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.898776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59cc747f79-h5ns4" event={"ID":"775ea4fb-9967-4ddb-bfc2-874afb08f0c1","Type":"ContainerDied","Data":"3649817de0f7e609437dc321d32e479869bbb15a8df12d2275104d0f01897de2"} Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.898821 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59cc747f79-h5ns4" Dec 05 20:24:31 crc kubenswrapper[4885]: E1205 20:24:31.902390 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-6jq57" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.912285 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938522 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938773 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.938932 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zc7\" (UniqueName: \"kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.939070 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.939126 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfd8x\" (UniqueName: 
\"kubernetes.io/projected/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-kube-api-access-mfd8x\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.939200 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.939259 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.939311 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/775ea4fb-9967-4ddb-bfc2-874afb08f0c1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.996134 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dd697974b-njsvr"] Dec 05 20:24:31 crc kubenswrapper[4885]: I1205 20:24:31.997664 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.007507 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dd697974b-njsvr"] Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.008871 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.008871 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sq8z4" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.009156 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.009316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.023105 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.033144 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59cc747f79-h5ns4"] Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.042851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043055 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 
20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043084 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbcsg\" (UniqueName: \"kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zc7\" (UniqueName: \"kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043460 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.043809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc 
kubenswrapper[4885]: I1205 20:24:32.044572 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.044598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.045238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.045740 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.062727 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zc7\" (UniqueName: \"kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7\") pod \"dnsmasq-dns-54d9b68659-r2zdz\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.145131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.145250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.145310 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.145332 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.145353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kbcsg\" (UniqueName: \"kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.148976 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.152805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.152962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.158899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.163702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbcsg\" (UniqueName: \"kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg\") pod \"neutron-6dd697974b-njsvr\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.237294 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"] Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.242540 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz"
Dec 05 20:24:32 crc kubenswrapper[4885]: E1205 20:24:32.320841 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:f17b61f2318b74648e174d73dd31deee6c0d1434605c9f32707aedf2f4378957"
Dec 05 20:24:32 crc kubenswrapper[4885]: E1205 20:24:32.321001 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:f17b61f2318b74648e174d73dd31deee6c0d1434605c9f32707aedf2f4378957,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h586h5fbh574h55fh99h5d9h677h57bhdch689h66bh9h67dh5ddh86h544h579h677h647h658h659h596h5b6h5cch5dh5fh565h5fch5b5hcdhd6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6tm99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a91533ae-4113-4680-8fb9-c0a3fa74daa8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.328197 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dd697974b-njsvr"
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.339223 4885 scope.go:117] "RemoveContainer" containerID="334a3762850280b82410ae5a389ee42f5c490420e8349e4d6ee4895f97252236"
Dec 05 20:24:32 crc kubenswrapper[4885]: W1205 20:24:32.351296 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58faa50e_ede0_4c8e_ad2d_d76b7e1feb2a.slice/crio-83f06635eceaaf2d00f9fd79d4ff304fde25ce7920a71f925b0717b58d2cf852 WatchSource:0}: Error finding container 83f06635eceaaf2d00f9fd79d4ff304fde25ce7920a71f925b0717b58d2cf852: Status 404 returned error can't find the container with id 83f06635eceaaf2d00f9fd79d4ff304fde25ce7920a71f925b0717b58d2cf852
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.821413 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d9999949d-c22ch"]
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.914535 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cct8h"]
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.920796 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d9999949d-c22ch" event={"ID":"d0f84b71-1907-4f71-833d-1e5561a4f0f8","Type":"ContainerStarted","Data":"338fca95bcfea4beabf86d6e07210a258f2b6624454a6af02650dbecc8938cec"}
Dec 05 20:24:32 crc kubenswrapper[4885]: I1205 20:24:32.923929 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerStarted","Data":"83f06635eceaaf2d00f9fd79d4ff304fde25ce7920a71f925b0717b58d2cf852"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.048598 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"]
Dec 05 20:24:33 crc kubenswrapper[4885]: W1205 20:24:33.053195 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a0bed2d_1fb8_4e60_8d5b_a468aab8985b.slice/crio-49b881bc8876759a5895dab48d2450d1d5d5bb5e21260ae081ea1bf19108b228 WatchSource:0}: Error finding container 49b881bc8876759a5895dab48d2450d1d5d5bb5e21260ae081ea1bf19108b228: Status 404 returned error can't find the container with id 49b881bc8876759a5895dab48d2450d1d5d5bb5e21260ae081ea1bf19108b228
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.170881 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dd697974b-njsvr"]
Dec 05 20:24:33 crc kubenswrapper[4885]: W1205 20:24:33.174639 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2037cb2f_46ad_4a89_b430_91dd3568954f.slice/crio-8d3b72e6e0e72f68b71baed20e96d8fe31690cae4d9d5608f4d79c762ab6d106 WatchSource:0}: Error finding container 8d3b72e6e0e72f68b71baed20e96d8fe31690cae4d9d5608f4d79c762ab6d106: Status 404 returned error can't find the container with id 8d3b72e6e0e72f68b71baed20e96d8fe31690cae4d9d5608f4d79c762ab6d106
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.211754 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775ea4fb-9967-4ddb-bfc2-874afb08f0c1" path="/var/lib/kubelet/pods/775ea4fb-9967-4ddb-bfc2-874afb08f0c1/volumes"
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.944331 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerStarted","Data":"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.944910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerStarted","Data":"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.953155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w6258" event={"ID":"9be03938-1d91-45a5-beba-a54b318fc799","Type":"ContainerStarted","Data":"7eda14d121200765d3e9ceee44920c17d2cd102f764cced4e39ea638bfb7c831"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.956203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d9999949d-c22ch" event={"ID":"d0f84b71-1907-4f71-833d-1e5561a4f0f8","Type":"ContainerStarted","Data":"aa1fa3292c4e88c3a6bfd98d5afb97b5e402e493ccfa2c6b5c970fee7ceb3610"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.958180 4885 generic.go:334] "Generic (PLEG): container finished" podID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerID="3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f" exitCode=0
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.958348 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" event={"ID":"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b","Type":"ContainerDied","Data":"3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.958441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" event={"ID":"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b","Type":"ContainerStarted","Data":"49b881bc8876759a5895dab48d2450d1d5d5bb5e21260ae081ea1bf19108b228"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.974430 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cct8h" event={"ID":"c9cd60a9-9ff8-4b35-9069-4e406b9771e1","Type":"ContainerStarted","Data":"79e7b3c2ed82726f00d2118846fb01e953e32a733ac03d1fb6e7186bad673750"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.974473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cct8h" event={"ID":"c9cd60a9-9ff8-4b35-9069-4e406b9771e1","Type":"ContainerStarted","Data":"5d66efcd7b1c3a3d11c0e7645bf344c0d075752a8554b8226be62a701fcbf885"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.980334 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsgxp" event={"ID":"af42085d-f7f5-4dd5-86d1-7019ba4d0888","Type":"ContainerStarted","Data":"733b7abb9b6783fa9892ee608b15a56dccdccc24196a35763797acfd3fe31d85"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.983633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerStarted","Data":"0c3e9a5fb95fc92032704efe76ee7c958740088a5f8f0333b8d3b9c0ac96d506"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.983669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerStarted","Data":"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.983681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerStarted","Data":"8d3b72e6e0e72f68b71baed20e96d8fe31690cae4d9d5608f4d79c762ab6d106"}
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.984361 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dd697974b-njsvr"
Dec 05 20:24:33 crc kubenswrapper[4885]: I1205 20:24:33.993432 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7ddb869454-vvfd9" podStartSLOduration=27.369710154 podStartE2EDuration="27.99340996s" podCreationTimestamp="2025-12-05 20:24:06 +0000 UTC" firstStartedPulling="2025-12-05 20:24:32.380505252 +0000 UTC m=+1137.677320913" lastFinishedPulling="2025-12-05 20:24:33.004205058 +0000 UTC m=+1138.301020719" observedRunningTime="2025-12-05 20:24:33.968372471 +0000 UTC m=+1139.265188132" watchObservedRunningTime="2025-12-05 20:24:33.99340996 +0000 UTC m=+1139.290225621"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.011224 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-w6258" podStartSLOduration=2.592905286 podStartE2EDuration="37.011203718s" podCreationTimestamp="2025-12-05 20:23:57 +0000 UTC" firstStartedPulling="2025-12-05 20:23:59.293239523 +0000 UTC m=+1104.590055184" lastFinishedPulling="2025-12-05 20:24:33.711537955 +0000 UTC m=+1139.008353616" observedRunningTime="2025-12-05 20:24:34.006155333 +0000 UTC m=+1139.302970994" watchObservedRunningTime="2025-12-05 20:24:34.011203718 +0000 UTC m=+1139.308019379"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.045552 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cct8h" podStartSLOduration=18.045533253 podStartE2EDuration="18.045533253s" podCreationTimestamp="2025-12-05 20:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:34.034696639 +0000 UTC m=+1139.331512330" watchObservedRunningTime="2025-12-05 20:24:34.045533253 +0000 UTC m=+1139.342348914"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.066040 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dd697974b-njsvr" podStartSLOduration=3.066000402 podStartE2EDuration="3.066000402s" podCreationTimestamp="2025-12-05 20:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:34.058209813 +0000 UTC m=+1139.355025474" watchObservedRunningTime="2025-12-05 20:24:34.066000402 +0000 UTC m=+1139.362816063"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.088444 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dsgxp" podStartSLOduration=3.555750046 podStartE2EDuration="1m7.088426891s" podCreationTimestamp="2025-12-05 20:23:27 +0000 UTC" firstStartedPulling="2025-12-05 20:23:28.120110194 +0000 UTC m=+1073.416925855" lastFinishedPulling="2025-12-05 20:24:31.652787039 +0000 UTC m=+1136.949602700" observedRunningTime="2025-12-05 20:24:34.085179931 +0000 UTC m=+1139.381995592" watchObservedRunningTime="2025-12-05 20:24:34.088426891 +0000 UTC m=+1139.385242552"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.229327 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-94b44cc8f-5tpnj"]
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.230930 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.235072 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.235260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.241991 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94b44cc8f-5tpnj"]
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300663 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74hkv\" (UniqueName: \"kubernetes.io/projected/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-kube-api-access-74hkv\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-ovndb-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-internal-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300928 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-combined-ca-bundle\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.300996 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-httpd-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.301268 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-public-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402600 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-public-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74hkv\" (UniqueName: \"kubernetes.io/projected/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-kube-api-access-74hkv\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-ovndb-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-internal-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402745 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-combined-ca-bundle\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.402774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-httpd-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.415338 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-public-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.417290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-httpd-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.417879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-combined-ca-bundle\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.418632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-config\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.423817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-ovndb-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.427574 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-internal-tls-certs\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.437295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74hkv\" (UniqueName: \"kubernetes.io/projected/0437ab7b-cd9d-46e8-9bca-7acdbefda1be-kube-api-access-74hkv\") pod \"neutron-94b44cc8f-5tpnj\" (UID: \"0437ab7b-cd9d-46e8-9bca-7acdbefda1be\") " pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.542428 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d5cc849d9-vffwb" podUID="e4751187-c98f-4fb5-aba4-63b0f8715b69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.550445 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.996483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/0.log"
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.997262 4885 generic.go:334] "Generic (PLEG): container finished" podID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerID="0c3e9a5fb95fc92032704efe76ee7c958740088a5f8f0333b8d3b9c0ac96d506" exitCode=1
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.997721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerDied","Data":"0c3e9a5fb95fc92032704efe76ee7c958740088a5f8f0333b8d3b9c0ac96d506"}
Dec 05 20:24:34 crc kubenswrapper[4885]: I1205 20:24:34.997931 4885 scope.go:117] "RemoveContainer" containerID="0c3e9a5fb95fc92032704efe76ee7c958740088a5f8f0333b8d3b9c0ac96d506"
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.001035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d9999949d-c22ch" event={"ID":"d0f84b71-1907-4f71-833d-1e5561a4f0f8","Type":"ContainerStarted","Data":"f8aa1b9e8494e1f1159a304f5204c541f6dc0db3da038062dd9ff3ddbef35303"}
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.007947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" event={"ID":"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b","Type":"ContainerStarted","Data":"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663"}
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.008239 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz"
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.049799 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d9999949d-c22ch" podStartSLOduration=28.354621275 podStartE2EDuration="29.049778468s" podCreationTimestamp="2025-12-05 20:24:06 +0000 UTC" firstStartedPulling="2025-12-05 20:24:32.846307523 +0000 UTC m=+1138.143123184" lastFinishedPulling="2025-12-05 20:24:33.541464716 +0000 UTC m=+1138.838280377" observedRunningTime="2025-12-05 20:24:35.03584894 +0000 UTC m=+1140.332664601" watchObservedRunningTime="2025-12-05 20:24:35.049778468 +0000 UTC m=+1140.346594129"
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.076578 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" podStartSLOduration=4.076559801 podStartE2EDuration="4.076559801s" podCreationTimestamp="2025-12-05 20:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:35.065077808 +0000 UTC m=+1140.361893469" watchObservedRunningTime="2025-12-05 20:24:35.076559801 +0000 UTC m=+1140.373375462"
Dec 05 20:24:35 crc kubenswrapper[4885]: I1205 20:24:35.216860 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94b44cc8f-5tpnj"]
Dec 05 20:24:35 crc kubenswrapper[4885]: W1205 20:24:35.232647 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0437ab7b_cd9d_46e8_9bca_7acdbefda1be.slice/crio-98225b6c30e334af106fb05cf78623e8292c75b088fbd6409ccd8f46e827e313 WatchSource:0}: Error finding container 98225b6c30e334af106fb05cf78623e8292c75b088fbd6409ccd8f46e827e313: Status 404 returned error can't find the container with id 98225b6c30e334af106fb05cf78623e8292c75b088fbd6409ccd8f46e827e313
Dec 05 20:24:36 crc kubenswrapper[4885]: I1205 20:24:36.015365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94b44cc8f-5tpnj" event={"ID":"0437ab7b-cd9d-46e8-9bca-7acdbefda1be","Type":"ContainerStarted","Data":"98225b6c30e334af106fb05cf78623e8292c75b088fbd6409ccd8f46e827e313"}
Dec 05 20:24:36 crc kubenswrapper[4885]: I1205 20:24:36.874006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7ddb869454-vvfd9"
Dec 05 20:24:36 crc kubenswrapper[4885]: I1205 20:24:36.874297 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7ddb869454-vvfd9"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.031362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94b44cc8f-5tpnj" event={"ID":"0437ab7b-cd9d-46e8-9bca-7acdbefda1be","Type":"ContainerStarted","Data":"2db530c652d4976e546a1ad5640f33cf9262f6cf10409175e6f35cab3391b507"}
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.091209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d9999949d-c22ch"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.091260 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d9999949d-c22ch"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.147118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/1.log"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.147919 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/0.log"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.148535 4885 generic.go:334] "Generic (PLEG): container finished" podID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerID="0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a" exitCode=1
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.148635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerDied","Data":"0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a"}
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.148675 4885 scope.go:117] "RemoveContainer" containerID="0c3e9a5fb95fc92032704efe76ee7c958740088a5f8f0333b8d3b9c0ac96d506"
Dec 05 20:24:37 crc kubenswrapper[4885]: I1205 20:24:37.149132 4885 scope.go:117] "RemoveContainer" containerID="0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a"
Dec 05 20:24:37 crc kubenswrapper[4885]: E1205 20:24:37.149316 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-6dd697974b-njsvr_openstack(2037cb2f-46ad-4a89-b430-91dd3568954f)\"" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f"
Dec 05 20:24:38 crc kubenswrapper[4885]: I1205 20:24:38.156331 4885 scope.go:117] "RemoveContainer" containerID="0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a"
Dec 05 20:24:38 crc kubenswrapper[4885]: E1205 20:24:38.156576 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-6dd697974b-njsvr_openstack(2037cb2f-46ad-4a89-b430-91dd3568954f)\"" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f"
Dec 05 20:24:39 crc kubenswrapper[4885]: I1205 20:24:39.169037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94b44cc8f-5tpnj" event={"ID":"0437ab7b-cd9d-46e8-9bca-7acdbefda1be","Type":"ContainerStarted","Data":"d0826fbdab1ced58e564e355cc1bdf248e0d8d2f80f1bc83a5d3a16d3ef33589"}
Dec 05 20:24:39 crc kubenswrapper[4885]: I1205 20:24:39.169398 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-94b44cc8f-5tpnj"
Dec 05 20:24:39 crc kubenswrapper[4885]: I1205 20:24:39.198077 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-94b44cc8f-5tpnj" podStartSLOduration=5.198062314 podStartE2EDuration="5.198062314s" podCreationTimestamp="2025-12-05 20:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:39.192975928 +0000 UTC m=+1144.489791579" watchObservedRunningTime="2025-12-05 20:24:39.198062314 +0000 UTC m=+1144.494877975"
Dec 05 20:24:40 crc kubenswrapper[4885]: I1205 20:24:40.180656 4885 generic.go:334] "Generic (PLEG): container finished" podID="c9cd60a9-9ff8-4b35-9069-4e406b9771e1" containerID="79e7b3c2ed82726f00d2118846fb01e953e32a733ac03d1fb6e7186bad673750" exitCode=0
Dec 05 20:24:40 crc kubenswrapper[4885]: I1205 20:24:40.180737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cct8h" event={"ID":"c9cd60a9-9ff8-4b35-9069-4e406b9771e1","Type":"ContainerDied","Data":"79e7b3c2ed82726f00d2118846fb01e953e32a733ac03d1fb6e7186bad673750"}
Dec 05 20:24:40 crc kubenswrapper[4885]: I1205 20:24:40.188520 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/1.log"
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.201610 4885 generic.go:334] "Generic (PLEG): container finished" podID="9be03938-1d91-45a5-beba-a54b318fc799" containerID="7eda14d121200765d3e9ceee44920c17d2cd102f764cced4e39ea638bfb7c831" exitCode=0
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.201817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w6258" event={"ID":"9be03938-1d91-45a5-beba-a54b318fc799","Type":"ContainerDied","Data":"7eda14d121200765d3e9ceee44920c17d2cd102f764cced4e39ea638bfb7c831"}
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.204465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerStarted","Data":"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"}
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.525599 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cct8h"
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7zvj\" (UniqueName: \"kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648588 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648676 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.648702 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data\") pod \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\" (UID: \"c9cd60a9-9ff8-4b35-9069-4e406b9771e1\") "
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.669308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts" (OuterVolumeSpecName: "scripts") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.670189 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.670481 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.670582 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj" (OuterVolumeSpecName: "kube-api-access-c7zvj") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "kube-api-access-c7zvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.679856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.682941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data" (OuterVolumeSpecName: "config-data") pod "c9cd60a9-9ff8-4b35-9069-4e406b9771e1" (UID: "c9cd60a9-9ff8-4b35-9069-4e406b9771e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755147 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755197 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755208 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7zvj\" (UniqueName: \"kubernetes.io/projected/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-kube-api-access-c7zvj\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755222 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755232 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:41 crc kubenswrapper[4885]: I1205 20:24:41.755242 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd60a9-9ff8-4b35-9069-4e406b9771e1-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.214188 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cct8h"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.214185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cct8h" event={"ID":"c9cd60a9-9ff8-4b35-9069-4e406b9771e1","Type":"ContainerDied","Data":"5d66efcd7b1c3a3d11c0e7645bf344c0d075752a8554b8226be62a701fcbf885"}
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.214580 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d66efcd7b1c3a3d11c0e7645bf344c0d075752a8554b8226be62a701fcbf885"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.244664 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.321776 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"]
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.322005 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="dnsmasq-dns" containerID="cri-o://6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62" gracePeriod=10
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.399846 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bdf6f4c4b-9n2vm"]
Dec 05 20:24:42 crc kubenswrapper[4885]: E1205 20:24:42.400442 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd60a9-9ff8-4b35-9069-4e406b9771e1" containerName="keystone-bootstrap"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.400516 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd60a9-9ff8-4b35-9069-4e406b9771e1" containerName="keystone-bootstrap"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.400803 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cd60a9-9ff8-4b35-9069-4e406b9771e1" containerName="keystone-bootstrap"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.402275 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.409666 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bdf6f4c4b-9n2vm"]
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.412614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.412991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.415216 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c6lcf"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.415398 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.415554 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.422128 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.595916 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-scripts\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596032 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-combined-ca-bundle\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596072 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-internal-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-fernet-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-config-data\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-credential-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-public-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.596203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhct\" (UniqueName: \"kubernetes.io/projected/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-kube-api-access-sqhct\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-scripts\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-combined-ca-bundle\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-internal-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-fernet-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697832 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-config-data\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-credential-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697915 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-public-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.697951 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqhct\" (UniqueName: \"kubernetes.io/projected/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-kube-api-access-sqhct\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.706647 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-scripts\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.706796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-config-data\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.731355 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-combined-ca-bundle\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.732118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-internal-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.732824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-public-tls-certs\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.732844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-fernet-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.734356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-credential-keys\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.734623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqhct\" (UniqueName: \"kubernetes.io/projected/a8ffb925-d20c-4c24-a3b2-158d9c347b6b-kube-api-access-sqhct\") pod \"keystone-7bdf6f4c4b-9n2vm\" (UID: \"a8ffb925-d20c-4c24-a3b2-158d9c347b6b\") " pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Need to start a new one" pod="openstack/keystone-7bdf6f4c4b-9n2vm" Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.862489 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w6258" Dec 05 20:24:42 crc kubenswrapper[4885]: I1205 20:24:42.969333 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.007899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts\") pod \"9be03938-1d91-45a5-beba-a54b318fc799\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.008561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lplrf\" (UniqueName: \"kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf\") pod \"9be03938-1d91-45a5-beba-a54b318fc799\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.008634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs\") pod \"9be03938-1d91-45a5-beba-a54b318fc799\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.008700 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data\") pod \"9be03938-1d91-45a5-beba-a54b318fc799\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.008718 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle\") pod \"9be03938-1d91-45a5-beba-a54b318fc799\" (UID: \"9be03938-1d91-45a5-beba-a54b318fc799\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.013119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs" (OuterVolumeSpecName: "logs") pod "9be03938-1d91-45a5-beba-a54b318fc799" (UID: "9be03938-1d91-45a5-beba-a54b318fc799"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.022001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf" (OuterVolumeSpecName: "kube-api-access-lplrf") pod "9be03938-1d91-45a5-beba-a54b318fc799" (UID: "9be03938-1d91-45a5-beba-a54b318fc799"). InnerVolumeSpecName "kube-api-access-lplrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.034283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts" (OuterVolumeSpecName: "scripts") pod "9be03938-1d91-45a5-beba-a54b318fc799" (UID: "9be03938-1d91-45a5-beba-a54b318fc799"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.068601 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data" (OuterVolumeSpecName: "config-data") pod "9be03938-1d91-45a5-beba-a54b318fc799" (UID: "9be03938-1d91-45a5-beba-a54b318fc799"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.097674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9be03938-1d91-45a5-beba-a54b318fc799" (UID: "9be03938-1d91-45a5-beba-a54b318fc799"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110167 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110290 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110376 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mdrx\" (UniqueName: \"kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110408 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config\") pod \"91e68318-2de7-47b6-b2fd-c5932959f0ce\" (UID: \"91e68318-2de7-47b6-b2fd-c5932959f0ce\") " Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110949 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lplrf\" (UniqueName: \"kubernetes.io/projected/9be03938-1d91-45a5-beba-a54b318fc799-kube-api-access-lplrf\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110975 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be03938-1d91-45a5-beba-a54b318fc799-logs\") on node 
\"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110987 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.110997 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.111009 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be03938-1d91-45a5-beba-a54b318fc799-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.115850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx" (OuterVolumeSpecName: "kube-api-access-2mdrx") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "kube-api-access-2mdrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.165503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.174148 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.184614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.212721 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mdrx\" (UniqueName: \"kubernetes.io/projected/91e68318-2de7-47b6-b2fd-c5932959f0ce-kube-api-access-2mdrx\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.212748 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.212757 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.212767 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.212798 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.220716 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config" (OuterVolumeSpecName: "config") pod "91e68318-2de7-47b6-b2fd-c5932959f0ce" (UID: "91e68318-2de7-47b6-b2fd-c5932959f0ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.231963 4885 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.231963 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w6258"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.232168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w6258" event={"ID":"9be03938-1d91-45a5-beba-a54b318fc799","Type":"ContainerDied","Data":"e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306"}
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.232222 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ae51cfd01e8b75e47b0396c35ea777f5c06fb3135919535196d0f715de7306"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.237803 4885 generic.go:334] "Generic (PLEG): container finished" podID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerID="6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62" exitCode=0
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.238162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" event={"ID":"91e68318-2de7-47b6-b2fd-c5932959f0ce","Type":"ContainerDied","Data":"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"}
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.238196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9" event={"ID":"91e68318-2de7-47b6-b2fd-c5932959f0ce","Type":"ContainerDied","Data":"c9325c3871d24c981c65870306c84307a66263d501cf0b128703cfdb722588a5"}
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.238217 4885 scope.go:117] "RemoveContainer" containerID="6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.238373 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.303535 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"]
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.312809 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f7cc5f48f-j8zf9"]
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.315815 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.315844 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e68318-2de7-47b6-b2fd-c5932959f0ce-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349187 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d98fd5798-8jhxf"]
Dec 05 20:24:43 crc kubenswrapper[4885]: E1205 20:24:43.349571 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be03938-1d91-45a5-beba-a54b318fc799" containerName="placement-db-sync"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349589 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be03938-1d91-45a5-beba-a54b318fc799" containerName="placement-db-sync"
Dec 05 20:24:43 crc kubenswrapper[4885]: E1205 20:24:43.349603 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="init"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349610 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="init"
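The paired cpu_manager/state_mem lines here record the CPU manager discarding per-container CPUSet assignments for pods the API server no longer knows about, before the newly added placement pod is admitted; the memory_manager lines just below do the same for memory state. A rough sketch of the idea, with hypothetical types rather than the kubelet's actual state package:

package main

import "fmt"

// assignments is a hypothetical miniature of the CPU manager's state:
// pod UID -> container name -> assigned cpuset.
type assignments map[string]map[string]string

// removeStaleState drops assignments for pods that are no longer active,
// producing the "RemoveStaleState: removing container" /
// "Deleted CPUSet assignment" pairs seen in the log.
func removeStaleState(a assignments, activePods map[string]bool) {
	for podUID, containers := range a {
		if activePods[podUID] {
			continue // pod still exists; keep its assignments
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%s containerName=%s\n", podUID, name)
			delete(containers, name) // "Deleted CPUSet assignment"
		}
		delete(a, podUID)
	}
}

func main() {
	a := assignments{
		"91e68318-2de7-47b6-b2fd-c5932959f0ce": {"init": "0-1", "dnsmasq-dns": "0-1"},
	}
	removeStaleState(a, map[string]bool{}) // the dnsmasq pod was just REMOVEd
}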
podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="init" Dec 05 20:24:43 crc kubenswrapper[4885]: E1205 20:24:43.349627 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="dnsmasq-dns" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349634 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="dnsmasq-dns" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349787 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be03938-1d91-45a5-beba-a54b318fc799" containerName="placement-db-sync" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.349814 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" containerName="dnsmasq-dns" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.350682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.357325 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.357437 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.357495 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.357568 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ffgjd" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.357577 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.366468 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d98fd5798-8jhxf"] Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.413569 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bdf6f4c4b-9n2vm"] Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.518765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-scripts\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.518863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-public-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.518893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-internal-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.518955 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-config-data\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.518991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-logs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.519033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-combined-ca-bundle\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.519080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xp4\" (UniqueName: \"kubernetes.io/projected/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-kube-api-access-g8xp4\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620500 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-logs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-combined-ca-bundle\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xp4\" (UniqueName: \"kubernetes.io/projected/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-kube-api-access-g8xp4\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-scripts\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-public-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-internal-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.620803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-config-data\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.621517 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-logs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.626384 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-combined-ca-bundle\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.627922 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-scripts\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.630326 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-config-data\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.632562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-public-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.635650 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-internal-tls-certs\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.639717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xp4\" (UniqueName: \"kubernetes.io/projected/eca7ccc4-d1ff-402c-9fe8-0c61746d41d1-kube-api-access-g8xp4\") pod \"placement-d98fd5798-8jhxf\" (UID: \"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1\") " pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.677269 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.708279 4885 scope.go:117] "RemoveContainer" containerID="fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22"
Dec 05 20:24:43 crc kubenswrapper[4885]: W1205 20:24:43.712956 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ffb925_d20c_4c24_a3b2_158d9c347b6b.slice/crio-41d2781b7394fb39c3edbc9202d35e6b116fc1ae246dd84664d601e8fa5438c1 WatchSource:0}: Error finding container 41d2781b7394fb39c3edbc9202d35e6b116fc1ae246dd84664d601e8fa5438c1: Status 404 returned error can't find the container with id 41d2781b7394fb39c3edbc9202d35e6b116fc1ae246dd84664d601e8fa5438c1
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.857213 4885 scope.go:117] "RemoveContainer" containerID="6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"
Dec 05 20:24:43 crc kubenswrapper[4885]: E1205 20:24:43.858847 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62\": container with ID starting with 6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62 not found: ID does not exist" containerID="6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.858883 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62"} err="failed to get container status \"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62\": rpc error: code = NotFound desc = could not find container \"6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62\": container with ID starting with 6c984a17f8fe658b2b3580baa9a6eb11df2c0e26fc5c8155764c4378e2119e62 not found: ID does not exist"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.858904 4885 scope.go:117] "RemoveContainer" containerID="fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22"
Dec 05 20:24:43 crc kubenswrapper[4885]: E1205 20:24:43.859293 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22\": container with ID starting with fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22 not found: ID does not exist" containerID="fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22"
Dec 05 20:24:43 crc kubenswrapper[4885]: I1205 20:24:43.859318 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22"} err="failed to get container status \"fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22\": rpc error: code = NotFound desc = could not find container \"fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22\": container with ID starting with fd51330d85ce00dfe3d175aa4ae4469ade1ed81167092ac5336d226304e5bc22 not found: ID does not exist"
Dec 05 20:24:44 crc kubenswrapper[4885]: I1205 20:24:44.217182 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d98fd5798-8jhxf"]
pod="openstack/placement-d98fd5798-8jhxf" event={"ID":"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1","Type":"ContainerStarted","Data":"60ae923dabb5d5f9b7fca10544f77285dabb97bffee0c9a0f1e440a4b1f4fb45"} Dec 05 20:24:44 crc kubenswrapper[4885]: I1205 20:24:44.286761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bdf6f4c4b-9n2vm" event={"ID":"a8ffb925-d20c-4c24-a3b2-158d9c347b6b","Type":"ContainerStarted","Data":"04e16cf9f25771aad7db8e6d049cda2459c15325d11c05890cb46eeb5d2296ca"} Dec 05 20:24:44 crc kubenswrapper[4885]: I1205 20:24:44.286799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bdf6f4c4b-9n2vm" event={"ID":"a8ffb925-d20c-4c24-a3b2-158d9c347b6b","Type":"ContainerStarted","Data":"41d2781b7394fb39c3edbc9202d35e6b116fc1ae246dd84664d601e8fa5438c1"} Dec 05 20:24:44 crc kubenswrapper[4885]: I1205 20:24:44.287870 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bdf6f4c4b-9n2vm" Dec 05 20:24:44 crc kubenswrapper[4885]: I1205 20:24:44.305274 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bdf6f4c4b-9n2vm" podStartSLOduration=2.305253292 podStartE2EDuration="2.305253292s" podCreationTimestamp="2025-12-05 20:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:44.303045804 +0000 UTC m=+1149.599861455" watchObservedRunningTime="2025-12-05 20:24:44.305253292 +0000 UTC m=+1149.602068953" Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.201370 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e68318-2de7-47b6-b2fd-c5932959f0ce" path="/var/lib/kubelet/pods/91e68318-2de7-47b6-b2fd-c5932959f0ce/volumes" Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.303917 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d98fd5798-8jhxf" event={"ID":"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1","Type":"ContainerStarted","Data":"ed7f55008a3c276d0386eb23c8d782a975aa4bbaf319543bbb694c0d8f3b309d"} Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.303960 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d98fd5798-8jhxf" event={"ID":"eca7ccc4-d1ff-402c-9fe8-0c61746d41d1","Type":"ContainerStarted","Data":"2d4f50c4b8d77ecf6644d99d37522ca013400fe2861178a5bed1f471f39e068c"} Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.303973 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.303983 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d98fd5798-8jhxf" Dec 05 20:24:45 crc kubenswrapper[4885]: I1205 20:24:45.334050 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d98fd5798-8jhxf" podStartSLOduration=2.334035331 podStartE2EDuration="2.334035331s" podCreationTimestamp="2025-12-05 20:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:45.329011536 +0000 UTC m=+1150.625827197" watchObservedRunningTime="2025-12-05 20:24:45.334035331 +0000 UTC m=+1150.630850992" Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.313128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6jq57" 
event={"ID":"e4a908e8-64e1-4fec-b455-66527f7efee3","Type":"ContainerStarted","Data":"b8ff479621e3db136b46b8e45f013a9e4ae7973dde8e3205e8fda0e34ba387b2"} Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.336365 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6jq57" podStartSLOduration=3.851100548 podStartE2EDuration="49.336342536s" podCreationTimestamp="2025-12-05 20:23:57 +0000 UTC" firstStartedPulling="2025-12-05 20:23:59.194695203 +0000 UTC m=+1104.491510854" lastFinishedPulling="2025-12-05 20:24:44.679937181 +0000 UTC m=+1149.976752842" observedRunningTime="2025-12-05 20:24:46.332406506 +0000 UTC m=+1151.629222167" watchObservedRunningTime="2025-12-05 20:24:46.336342536 +0000 UTC m=+1151.633158197" Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.632252 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.632558 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.632599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.633572 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.633664 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c" gracePeriod=600 Dec 05 20:24:46 crc kubenswrapper[4885]: I1205 20:24:46.875778 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.093154 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d9999949d-c22ch" podUID="d0f84b71-1907-4f71-833d-1e5561a4f0f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.337336 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" 
containerID="7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c" exitCode=0 Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.337396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c"} Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.337428 4885 scope.go:117] "RemoveContainer" containerID="838d57f53907a18978ccf285771525c5f73a2f0a8cab487f678fbc79c5b8663f" Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.339743 4885 generic.go:334] "Generic (PLEG): container finished" podID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" containerID="733b7abb9b6783fa9892ee608b15a56dccdccc24196a35763797acfd3fe31d85" exitCode=0 Dec 05 20:24:47 crc kubenswrapper[4885]: I1205 20:24:47.339782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsgxp" event={"ID":"af42085d-f7f5-4dd5-86d1-7019ba4d0888","Type":"ContainerDied","Data":"733b7abb9b6783fa9892ee608b15a56dccdccc24196a35763797acfd3fe31d85"} Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.362230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsgxp" event={"ID":"af42085d-f7f5-4dd5-86d1-7019ba4d0888","Type":"ContainerDied","Data":"58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f"} Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.362753 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58141926f1a496953a02fd8d1d9e6a765f0192215c9c05830813496dd73d5f1f" Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.371990 4885 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.547150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g76pq\" (UniqueName: \"kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq\") pod \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") "
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.547262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data\") pod \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") "
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.547296 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle\") pod \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") "
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.547370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data\") pod \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\" (UID: \"af42085d-f7f5-4dd5-86d1-7019ba4d0888\") "
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.552310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af42085d-f7f5-4dd5-86d1-7019ba4d0888" (UID: "af42085d-f7f5-4dd5-86d1-7019ba4d0888"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.552592 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq" (OuterVolumeSpecName: "kube-api-access-g76pq") pod "af42085d-f7f5-4dd5-86d1-7019ba4d0888" (UID: "af42085d-f7f5-4dd5-86d1-7019ba4d0888"). InnerVolumeSpecName "kube-api-access-g76pq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.604750 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data" (OuterVolumeSpecName: "config-data") pod "af42085d-f7f5-4dd5-86d1-7019ba4d0888" (UID: "af42085d-f7f5-4dd5-86d1-7019ba4d0888"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.613552 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af42085d-f7f5-4dd5-86d1-7019ba4d0888" (UID: "af42085d-f7f5-4dd5-86d1-7019ba4d0888"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.652123 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.652164 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.652174 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af42085d-f7f5-4dd5-86d1-7019ba4d0888-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:49 crc kubenswrapper[4885]: I1205 20:24:49.652183 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g76pq\" (UniqueName: \"kubernetes.io/projected/af42085d-f7f5-4dd5-86d1-7019ba4d0888-kube-api-access-g76pq\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.374844 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4a908e8-64e1-4fec-b455-66527f7efee3" containerID="b8ff479621e3db136b46b8e45f013a9e4ae7973dde8e3205e8fda0e34ba387b2" exitCode=0 Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.374919 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6jq57" event={"ID":"e4a908e8-64e1-4fec-b455-66527f7efee3","Type":"ContainerDied","Data":"b8ff479621e3db136b46b8e45f013a9e4ae7973dde8e3205e8fda0e34ba387b2"} Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.378307 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsgxp" Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.840839 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"] Dec 05 20:24:50 crc kubenswrapper[4885]: E1205 20:24:50.841465 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" containerName="glance-db-sync" Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.841480 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" containerName="glance-db-sync" Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.841656 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" containerName="glance-db-sync" Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.842538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.866293 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"]
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984604 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984721 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984807 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58zd\" (UniqueName: \"kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:50 crc kubenswrapper[4885]: I1205 20:24:50.984996 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
Dec 05 20:24:51 crc kubenswrapper[4885]: E1205 20:24:51.083971 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8"
Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.086874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq"
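The pod_workers error for ceilometer-0 above is an aggregate: each container that failed to start contributes its own "failed to StartContainer ... ErrImagePull" entry, and the pod-level sync error joins them, which is why two failures show up in a single "Error syncing pod, skipping" line. A minimal illustration with hypothetical shapes (not the kubelet's actual types):

package main

import (
	"errors"
	"fmt"
)

// syncPod joins per-container start failures into one pod-level error,
// mirroring the bracketed list in the log line above.
func syncPod(containerErrs map[string]error) error {
	var errs []error
	for name, err := range containerErrs {
		if err != nil {
			errs = append(errs, fmt.Errorf("failed to %q for %q with ErrImagePull: %w", "StartContainer", name, err))
		}
	}
	return errors.Join(errs...) // nil when every container started
}

func main() {
	err := syncPod(map[string]error{
		"ceilometer-central-agent":      errors.New("copying config: context canceled"),
		"ceilometer-notification-agent": errors.New("copying config: context canceled"),
	})
	fmt.Println(err)
}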
pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.086923 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.086964 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.086989 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.087012 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58zd\" (UniqueName: \"kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.087069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.087817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.087847 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.088077 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.088176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 
20:24:51.088764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.106334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58zd\" (UniqueName: \"kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd\") pod \"dnsmasq-dns-7c874f55b9-2pvnq\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.212319 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.418645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5szt6" event={"ID":"88521675-6180-4a17-ba7d-6bb9eb07e7dd","Type":"ContainerStarted","Data":"499638b80cfea95c0f85dd0b05050fd6e7a749f7329e84bb4c6d967622a75e2b"} Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.420883 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerStarted","Data":"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"} Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.421027 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.421058 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="proxy-httpd" containerID="cri-o://59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa" gracePeriod=30 Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.421006 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="sg-core" containerID="cri-o://fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6" gracePeriod=30 Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.445819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34"} Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.448042 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5szt6" podStartSLOduration=1.963507522 podStartE2EDuration="53.448013302s" podCreationTimestamp="2025-12-05 20:23:58 +0000 UTC" firstStartedPulling="2025-12-05 20:23:59.297425773 +0000 UTC m=+1104.594241434" lastFinishedPulling="2025-12-05 20:24:50.781931553 +0000 UTC m=+1156.078747214" observedRunningTime="2025-12-05 20:24:51.442505602 +0000 UTC m=+1156.739321263" watchObservedRunningTime="2025-12-05 20:24:51.448013302 +0000 UTC m=+1156.744828963" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.710338 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.714321 4885 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.745239 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lcfzg" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.745375 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.756810 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.766272 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.850487 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"] Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.901892 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902257 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902305 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r426h\" (UniqueName: \"kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902319 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:51 crc kubenswrapper[4885]: I1205 20:24:51.902376 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r426h\" (UniqueName: \"kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004847 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004876 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.004929 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.005166 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.005404 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.005524 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs\") pod 
\"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.009292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.009583 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.009592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.010876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.046909 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r426h\" (UniqueName: \"kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.062854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.064096 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6jq57" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.072351 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.088037 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099111 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.099548 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="proxy-httpd"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099562 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="proxy-httpd"
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.099576 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="sg-core"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099583 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="sg-core"
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.099610 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" containerName="cinder-db-sync"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099618 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" containerName="cinder-db-sync"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099830 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" containerName="cinder-db-sync"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099860 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="sg-core"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.099875 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerName="proxy-httpd"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.100926 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.103622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.130795 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.172237 4885 scope.go:117] "RemoveContainer" containerID="0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.208751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209130 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209297 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwlx\" (UniqueName: \"kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209323 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209401 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle\") pod \"e4a908e8-64e1-4fec-b455-66527f7efee3\" (UID: \"e4a908e8-64e1-4fec-b455-66527f7efee3\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209474 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tm99\" (UniqueName: \"kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99\") pod \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\" (UID: \"a91533ae-4113-4680-8fb9-c0a3fa74daa8\") "
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209696 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209746 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209810 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6fqj\" (UniqueName: \"kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.209844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.233335 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.234627 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99" (OuterVolumeSpecName: "kube-api-access-6tm99") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "kube-api-access-6tm99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.236375 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx" (OuterVolumeSpecName: "kube-api-access-wzwlx") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "kube-api-access-wzwlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.236494 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.239092 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.240285 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.240576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts" (OuterVolumeSpecName: "scripts") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.272167 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts" (OuterVolumeSpecName: "scripts") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.277498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.313265 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.313792 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6fqj\" (UniqueName: \"kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314199 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314472 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314578 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314593 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314603 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tm99\" (UniqueName: \"kubernetes.io/projected/a91533ae-4113-4680-8fb9-c0a3fa74daa8-kube-api-access-6tm99\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314616 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314626 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314636 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314648 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a91533ae-4113-4680-8fb9-c0a3fa74daa8-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314657 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwlx\" (UniqueName: \"kubernetes.io/projected/e4a908e8-64e1-4fec-b455-66527f7efee3-kube-api-access-wzwlx\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.314666 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4a908e8-64e1-4fec-b455-66527f7efee3-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.315188 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.315555 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.319879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.324203 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.326357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.341749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.348351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6fqj\" (UniqueName: \"kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.354935 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.371782 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data" (OuterVolumeSpecName: "config-data") pod "e4a908e8-64e1-4fec-b455-66527f7efee3" (UID: "e4a908e8-64e1-4fec-b455-66527f7efee3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.378661 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data" (OuterVolumeSpecName: "config-data") pod "a91533ae-4113-4680-8fb9-c0a3fa74daa8" (UID: "a91533ae-4113-4680-8fb9-c0a3fa74daa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.399390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.416625 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a908e8-64e1-4fec-b455-66527f7efee3-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.416662 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.416672 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.416681 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a91533ae-4113-4680-8fb9-c0a3fa74daa8-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.424412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.464602 4885 generic.go:334] "Generic (PLEG): container finished" podID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerID="014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682" exitCode=0
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.464889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" event={"ID":"77725a9f-be2f-4853-bffa-7087d08a6e89","Type":"ContainerDied","Data":"014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.465112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" event={"ID":"77725a9f-be2f-4853-bffa-7087d08a6e89","Type":"ContainerStarted","Data":"89b61dd6234e3c110fdaac565ebbdd4b0409f044f0e83c504a4a576cb19faf98"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.472929 4885 generic.go:334] "Generic (PLEG): container finished" podID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerID="59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa" exitCode=0
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.472955 4885 generic.go:334] "Generic (PLEG): container finished" podID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" containerID="fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6" exitCode=2
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.472996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerDied","Data":"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.473038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerDied","Data":"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.473049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a91533ae-4113-4680-8fb9-c0a3fa74daa8","Type":"ContainerDied","Data":"ee8b99d58db7f40769260f0ce89044e6f0dc08f15156fbf6b64831c02b2c6be8"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.473063 4885 scope.go:117] "RemoveContainer" containerID="59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.473189 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.491919 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6jq57" event={"ID":"e4a908e8-64e1-4fec-b455-66527f7efee3","Type":"ContainerDied","Data":"62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214"}
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.491961 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62d5152b38b24efd465c19778bdbdfe92568643ed2370eeb4ffcd6ebed6a4214"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.492661 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6jq57"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.525190 4885 scope.go:117] "RemoveContainer" containerID="fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.545998 4885 scope.go:117] "RemoveContainer" containerID="59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.550124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa\": container with ID starting with 59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa not found: ID does not exist" containerID="59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.550312 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"} err="failed to get container status \"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa\": rpc error: code = NotFound desc = could not find container \"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa\": container with ID starting with 59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa not found: ID does not exist"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.550438 4885 scope.go:117] "RemoveContainer" containerID="fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.558227 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6\": container with ID starting with fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6 not found: ID does not exist" containerID="fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.558268 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"} err="failed to get container status \"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6\": rpc error: code = NotFound desc = could not find container \"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6\": container with ID starting with fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6 not found: ID does not exist"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.558296 4885 scope.go:117] "RemoveContainer" containerID="59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.558932 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa"} err="failed to get container status \"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa\": rpc error: code = NotFound desc = could not find container \"59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa\": container with ID starting with 59c3afa3a474b675732270fb712ca5c51c022bab8fb20b0ba5a79dfc38705daa not found: ID does not exist"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.558973 4885 scope.go:117] "RemoveContainer" containerID="fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.559489 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6"} err="failed to get container status \"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6\": rpc error: code = NotFound desc = could not find container \"fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6\": container with ID starting with fb20cd859c6eb1cd232842865e9303bc0a6a7e2ba51ad7becc488eb0e433ffa6 not found: ID does not exist"
Dec 05 20:24:52 crc kubenswrapper[4885]: E1205 20:24:52.563448 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77725a9f_be2f_4853_bffa_7087d08a6e89.slice/crio-conmon-014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.635176 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.671175 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.686952 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.688882 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.693944 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.694937 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.701366 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769584 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.769822 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cq9\" (UniqueName: \"kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.797621 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.800945 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.842238 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.842622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hbpgp"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.842754 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.843054 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.896608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.896912 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.904641 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.904985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cq9\" (UniqueName: \"kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905332 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g96nt\" (UniqueName: \"kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.905589 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.907091 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.907449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.911800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.921321 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.927288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.929664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.942122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cq9\" (UniqueName: \"kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9\") pod \"ceilometer-0\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " pod="openstack/ceilometer-0"
Dec 05 20:24:52 crc kubenswrapper[4885]: I1205 20:24:52.977940 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.006929 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.007049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.007065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.007085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.007115 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g96nt\" (UniqueName: \"kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.007170 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.012142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.012848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.012864 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.013212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.015963 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.027663 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.043266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g96nt\" (UniqueName: \"kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt\") pod \"cinder-scheduler-0\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.073508 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.100770 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.102118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.124061 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.157586 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.159310 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.162144 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.203012 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91533ae-4113-4680-8fb9-c0a3fa74daa8" path="/var/lib/kubelet/pods/a91533ae-4113-4680-8fb9-c0a3fa74daa8/volumes"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.203655 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212813 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212857 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212900 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7hp\" (UniqueName: \"kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212916 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8956\" (UniqueName: \"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.212992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.213047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.213133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.213243 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.213346 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.213407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.280316 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.321861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.321932 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.322742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.322813 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.322866 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.322962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.323060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7hp\" (UniqueName: \"kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.323536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8956\" (UniqueName: \"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.323584 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.323734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.323949 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324443 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.324978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.325038 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.327183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.328264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.332335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.332874 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.333541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.334751 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.344456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.345978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8956\" (UniqueName: \"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956\") pod \"dnsmasq-dns-d6cb77c59-k7mtf\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.356331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7hp\" (UniqueName: \"kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp\") pod \"cinder-api-0\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.462509 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.484150 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.559247 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/2.log"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.561161 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/1.log"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.562226 4885 generic.go:334] "Generic (PLEG): container finished" podID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f" exitCode=1
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.562336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerDied","Data":"887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f"}
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.562394 4885 scope.go:117] "RemoveContainer" containerID="0c8c1c70c31d9755459ef8b4e3697cb4709f996f244e81647bc735f33560ce0a"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.563382 4885 scope.go:117] "RemoveContainer" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f"
Dec 05 20:24:53 crc kubenswrapper[4885]: E1205 20:24:53.563625 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dd697974b-njsvr_openstack(2037cb2f-46ad-4a89-b430-91dd3568954f)\"" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f"
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.579650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerStarted","Data":"46c89aacd30cd723e28c64fbe394efd7c5170398579fd0a60ad2c69e98830d4e"}
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.630093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerStarted","Data":"f468ad4a4332462d450398089213aa5ebb5e7f9325e0a3a717fa58de0ea07fb3"}
Dec 05 20:24:53 crc kubenswrapper[4885]: I1205 20:24:53.856284 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.197395 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.269436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.415510 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"]
Dec 05 20:24:54 crc kubenswrapper[4885]: W1205 20:24:54.438915 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb800e0bf_89cc_47a6_8984_63b35e27d593.slice/crio-18b9fa15912b5e9c2c564d5e8dfadccad08023f82b9a555ccb7936c26d6589d7 WatchSource:0}: Error finding container 18b9fa15912b5e9c2c564d5e8dfadccad08023f82b9a555ccb7936c26d6589d7: Status 404 returned error can't find the container with id 18b9fa15912b5e9c2c564d5e8dfadccad08023f82b9a555ccb7936c26d6589d7
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.570638 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.651159 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerStarted","Data":"64141964f111aee59ad5027cc88e28904f16af8a8b3e6f9b067cfef82ea32041"}
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.659734 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.660652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" event={"ID":"b800e0bf-89cc-47a6-8984-63b35e27d593","Type":"ContainerStarted","Data":"18b9fa15912b5e9c2c564d5e8dfadccad08023f82b9a555ccb7936c26d6589d7"}
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.678505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerStarted","Data":"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd"}
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.728422 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerStarted","Data":"54b477bee180673cd7896114bc320c2eb91386652ac31f8a3759127983228a13"}
Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.732578 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0"
event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerStarted","Data":"e42c28b1c161fd6089547ebbe6e46c73f9484f28c4b904614326fe401ebdd4f5"} Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.738164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/2.log" Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.740370 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerStarted","Data":"6ffa8c97a03eca7a23b606c7a59000c732dd8c61f2e987a29cfde1d7f34d8ac5"} Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.747971 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" event={"ID":"77725a9f-be2f-4853-bffa-7087d08a6e89","Type":"ContainerStarted","Data":"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e"} Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.760252 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="dnsmasq-dns" containerID="cri-o://166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e" gracePeriod=10 Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.760997 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:54 crc kubenswrapper[4885]: I1205 20:24:54.823790 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" podStartSLOduration=4.823767738 podStartE2EDuration="4.823767738s" podCreationTimestamp="2025-12-05 20:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:54.800593995 +0000 UTC m=+1160.097409656" watchObservedRunningTime="2025-12-05 20:24:54.823767738 +0000 UTC m=+1160.120583399" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.495736 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.628615 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58zd\" (UniqueName: \"kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.629073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.629166 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.629253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.629362 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.629551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0\") pod \"77725a9f-be2f-4853-bffa-7087d08a6e89\" (UID: \"77725a9f-be2f-4853-bffa-7087d08a6e89\") " Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.638330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd" (OuterVolumeSpecName: "kube-api-access-k58zd") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "kube-api-access-k58zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.703469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config" (OuterVolumeSpecName: "config") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.729181 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.730156 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731306 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731661 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731684 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731701 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58zd\" (UniqueName: \"kubernetes.io/projected/77725a9f-be2f-4853-bffa-7087d08a6e89-kube-api-access-k58zd\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731713 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.731723 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.752512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77725a9f-be2f-4853-bffa-7087d08a6e89" (UID: "77725a9f-be2f-4853-bffa-7087d08a6e89"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.760946 4885 generic.go:334] "Generic (PLEG): container finished" podID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerID="7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35" exitCode=0 Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.761038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" event={"ID":"b800e0bf-89cc-47a6-8984-63b35e27d593","Type":"ContainerDied","Data":"7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.782317 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerStarted","Data":"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.782461 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-log" containerID="cri-o://ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" gracePeriod=30 Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.783204 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-httpd" containerID="cri-o://61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" gracePeriod=30 Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.818439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerStarted","Data":"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.833118 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77725a9f-be2f-4853-bffa-7087d08a6e89-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.841249 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-log" containerID="cri-o://6ffa8c97a03eca7a23b606c7a59000c732dd8c61f2e987a29cfde1d7f34d8ac5" gracePeriod=30 Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.841324 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerStarted","Data":"988e964e83214e834daa7ac7b58ac5d4e9002657011a5ce8ff0bcc8f06aef294"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.841586 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-httpd" containerID="cri-o://988e964e83214e834daa7ac7b58ac5d4e9002657011a5ce8ff0bcc8f06aef294" gracePeriod=30 Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.870487 4885 generic.go:334] "Generic (PLEG): container finished" podID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerID="166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e" exitCode=0 Dec 05 20:24:55 crc 
kubenswrapper[4885]: I1205 20:24:55.870649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" event={"ID":"77725a9f-be2f-4853-bffa-7087d08a6e89","Type":"ContainerDied","Data":"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.870730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" event={"ID":"77725a9f-be2f-4853-bffa-7087d08a6e89","Type":"ContainerDied","Data":"89b61dd6234e3c110fdaac565ebbdd4b0409f044f0e83c504a4a576cb19faf98"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.870772 4885 scope.go:117] "RemoveContainer" containerID="166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.871380 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c874f55b9-2pvnq" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.883312 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.883295272 podStartE2EDuration="4.883295272s" podCreationTimestamp="2025-12-05 20:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:55.862448102 +0000 UTC m=+1161.159263783" watchObservedRunningTime="2025-12-05 20:24:55.883295272 +0000 UTC m=+1161.180110933" Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.884046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerStarted","Data":"7477b3c060d022f8f50cf4bd9b5f3036aea8e423314f2393aedf009bdffd1f91"} Dec 05 20:24:55 crc kubenswrapper[4885]: I1205 20:24:55.911570 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.911551561 podStartE2EDuration="5.911551561s" podCreationTimestamp="2025-12-05 20:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:55.897268942 +0000 UTC m=+1161.194084603" watchObservedRunningTime="2025-12-05 20:24:55.911551561 +0000 UTC m=+1161.208367222" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.102434 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"] Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.118598 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c874f55b9-2pvnq"] Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.292822 4885 scope.go:117] "RemoveContainer" containerID="014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.450354 4885 scope.go:117] "RemoveContainer" containerID="166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e" Dec 05 20:24:56 crc kubenswrapper[4885]: E1205 20:24:56.452124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e\": container with ID starting with 166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e not found: ID does not exist" 
containerID="166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.452171 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e"} err="failed to get container status \"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e\": rpc error: code = NotFound desc = could not find container \"166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e\": container with ID starting with 166dc84c795f9be405d4355a07ac1bc8620578b8c9df7c8ffb23e578e964452e not found: ID does not exist" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.452198 4885 scope.go:117] "RemoveContainer" containerID="014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682" Dec 05 20:24:56 crc kubenswrapper[4885]: E1205 20:24:56.452522 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682\": container with ID starting with 014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682 not found: ID does not exist" containerID="014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.452547 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682"} err="failed to get container status \"014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682\": rpc error: code = NotFound desc = could not find container \"014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682\": container with ID starting with 014a02a8608793570cec1d1ada37af2086f5397a323c6bd46dd1f7c69b1a6682 not found: ID does not exist" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.696265 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.697507 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781392 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6fqj\" (UniqueName: \"kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781632 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781673 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781699 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781782 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.781897 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\" (UID: \"70ad27ce-e57b-4dc9-a8c6-95edbd158105\") " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.796110 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj" (OuterVolumeSpecName: "kube-api-access-s6fqj") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "kube-api-access-s6fqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.796332 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.796631 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs" (OuterVolumeSpecName: "logs") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.820566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts" (OuterVolumeSpecName: "scripts") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.828172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.875080 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.883927 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.883971 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ad27ce-e57b-4dc9-a8c6-95edbd158105-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.883999 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.884012 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6fqj\" (UniqueName: \"kubernetes.io/projected/70ad27ce-e57b-4dc9-a8c6-95edbd158105-kube-api-access-s6fqj\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.884041 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.897140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" event={"ID":"b800e0bf-89cc-47a6-8984-63b35e27d593","Type":"ContainerStarted","Data":"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.898190 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904432 4885 generic.go:334] 
"Generic (PLEG): container finished" podID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerID="61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" exitCode=143 Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904459 4885 generic.go:334] "Generic (PLEG): container finished" podID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerID="ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" exitCode=143 Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerDied","Data":"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerDied","Data":"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904527 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70ad27ce-e57b-4dc9-a8c6-95edbd158105","Type":"ContainerDied","Data":"f468ad4a4332462d450398089213aa5ebb5e7f9325e0a3a717fa58de0ea07fb3"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904544 4885 scope.go:117] "RemoveContainer" containerID="61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.904580 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.909243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerStarted","Data":"b41ea50d27ecba2edfa16fca65321dff4b320dd87d8d8ac6979364c3c888372d"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.916180 4885 generic.go:334] "Generic (PLEG): container finished" podID="0c75814a-fdba-4523-9f63-1859ada5601d" containerID="988e964e83214e834daa7ac7b58ac5d4e9002657011a5ce8ff0bcc8f06aef294" exitCode=143 Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.916218 4885 generic.go:334] "Generic (PLEG): container finished" podID="0c75814a-fdba-4523-9f63-1859ada5601d" containerID="6ffa8c97a03eca7a23b606c7a59000c732dd8c61f2e987a29cfde1d7f34d8ac5" exitCode=143 Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.916284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerDied","Data":"988e964e83214e834daa7ac7b58ac5d4e9002657011a5ce8ff0bcc8f06aef294"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.916331 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerDied","Data":"6ffa8c97a03eca7a23b606c7a59000c732dd8c61f2e987a29cfde1d7f34d8ac5"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.928543 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" podStartSLOduration=4.928521047 podStartE2EDuration="4.928521047s" podCreationTimestamp="2025-12-05 20:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:56.922137221 +0000 UTC m=+1162.218952882" watchObservedRunningTime="2025-12-05 20:24:56.928521047 +0000 UTC m=+1162.225336718" Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.931400 4885 generic.go:334] "Generic (PLEG): container finished" podID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" containerID="499638b80cfea95c0f85dd0b05050fd6e7a749f7329e84bb4c6d967622a75e2b" exitCode=0 Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.931538 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5szt6" event={"ID":"88521675-6180-4a17-ba7d-6bb9eb07e7dd","Type":"ContainerDied","Data":"499638b80cfea95c0f85dd0b05050fd6e7a749f7329e84bb4c6d967622a75e2b"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.942634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerStarted","Data":"82480b8bca44fa4540af3d46d873a0f5a5e6e84917799b9e43d88e1171d88b47"} Dec 05 20:24:56 crc kubenswrapper[4885]: I1205 20:24:56.992452 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.001404 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.010738 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data" (OuterVolumeSpecName: "config-data") pod "70ad27ce-e57b-4dc9-a8c6-95edbd158105" (UID: "70ad27ce-e57b-4dc9-a8c6-95edbd158105"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.089806 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.090160 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ad27ce-e57b-4dc9-a8c6-95edbd158105-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.090177 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.091570 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d9999949d-c22ch" podUID="d0f84b71-1907-4f71-833d-1e5561a4f0f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.097482 4885 scope.go:117] "RemoveContainer" containerID="ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.099924 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.124333 4885 scope.go:117] "RemoveContainer" containerID="61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.126304 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3\": container with ID starting with 61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3 not found: ID does not exist" containerID="61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.126343 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3"} err="failed to get container status \"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3\": rpc error: code = NotFound desc = could not find container \"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3\": container with ID starting with 61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3 not found: ID does not exist" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.126366 4885 scope.go:117] "RemoveContainer" containerID="ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.128167 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd\": container with ID starting with ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd not found: ID does not exist" containerID="ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.128190 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd"} err="failed to get container status \"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd\": rpc error: code = NotFound desc = could not find container \"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd\": container with ID starting with ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd not found: ID does not exist" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.128206 4885 scope.go:117] "RemoveContainer" containerID="61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.133248 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3"} err="failed to get container status \"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3\": rpc error: code = NotFound desc = could not find container \"61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3\": container with ID starting with 61c02851313eab3083b11dbd0988e991eb66232fa2cdbc32923c80b9627753e3 not found: ID does not exist" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.133289 4885 scope.go:117] "RemoveContainer" containerID="ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.133663 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd"} err="failed to get container status \"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd\": rpc error: code = NotFound desc = could not find container \"ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd\": container with ID starting with ab826ea4a30405a3632f341f62232335691888e8bdbf5800752688f1f5f04ffd not found: ID does not exist" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.190443 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" path="/var/lib/kubelet/pods/77725a9f-be2f-4853-bffa-7087d08a6e89/volumes" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r426h\" (UniqueName: \"kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191700 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.191724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"0c75814a-fdba-4523-9f63-1859ada5601d\" (UID: \"0c75814a-fdba-4523-9f63-1859ada5601d\") " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.192622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs" (OuterVolumeSpecName: "logs") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.194298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.217548 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.218310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts" (OuterVolumeSpecName: "scripts") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.218635 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h" (OuterVolumeSpecName: "kube-api-access-r426h") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "kube-api-access-r426h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.226433 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.277078 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.293174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data" (OuterVolumeSpecName: "config-data") pod "0c75814a-fdba-4523-9f63-1859ada5601d" (UID: "0c75814a-fdba-4523-9f63-1859ada5601d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.297590 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309253 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309285 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309309 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309318 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c75814a-fdba-4523-9f63-1859ada5601d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309327 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309336 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c75814a-fdba-4523-9f63-1859ada5601d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.309344 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r426h\" (UniqueName: \"kubernetes.io/projected/0c75814a-fdba-4523-9f63-1859ada5601d-kube-api-access-r426h\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310478 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310856 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310875 4885 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310894 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="dnsmasq-dns" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310903 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="dnsmasq-dns" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310917 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310923 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310940 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310962 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310969 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: E1205 20:24:57.310987 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="init" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.310995 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="init" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.311198 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.311220 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-log" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.311229 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.311247 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" containerName="glance-httpd" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.311260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="77725a9f-be2f-4853-bffa-7087d08a6e89" containerName="dnsmasq-dns" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.320326 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.324258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.324481 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.324715 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.374774 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.410845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.410908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.410957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtcs\" (UniqueName: \"kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.410986 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.411047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.411083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.411122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.411155 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.411226 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.513653 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514076 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514129 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtcs\" (UniqueName: \"kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514211 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514292 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514332 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.514829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.518420 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.518807 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.519319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.520391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.522695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.523682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.535570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgtcs\" (UniqueName: \"kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.546390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.639942 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.956156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerStarted","Data":"47b42e7d0f9fe620276453c4bf580e527f85a60f2406dbf360b083caaf8ce747"} Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.957748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerStarted","Data":"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c"} Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.957886 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api-log" containerID="cri-o://9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" gracePeriod=30 Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.958135 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.958169 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api" containerID="cri-o://1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" gracePeriod=30 Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.964916 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.965094 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c75814a-fdba-4523-9f63-1859ada5601d","Type":"ContainerDied","Data":"46c89aacd30cd723e28c64fbe394efd7c5170398579fd0a60ad2c69e98830d4e"} Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.965152 4885 scope.go:117] "RemoveContainer" containerID="988e964e83214e834daa7ac7b58ac5d4e9002657011a5ce8ff0bcc8f06aef294" Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.978102 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerStarted","Data":"03a24ddc48d7509b7fe69939b8d26cf41d3ee07988222a741a9025493e4c0d22"} Dec 05 20:24:57 crc kubenswrapper[4885]: I1205 20:24:57.984041 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.031660617 podStartE2EDuration="5.984007437s" podCreationTimestamp="2025-12-05 20:24:52 +0000 UTC" firstStartedPulling="2025-12-05 20:24:54.201750234 +0000 UTC m=+1159.498565895" lastFinishedPulling="2025-12-05 20:24:55.154097054 +0000 UTC m=+1160.450912715" observedRunningTime="2025-12-05 20:24:57.981510051 +0000 UTC m=+1163.278325722" watchObservedRunningTime="2025-12-05 20:24:57.984007437 +0000 UTC m=+1163.280823098" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.016871 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.016852197 podStartE2EDuration="6.016852197s" podCreationTimestamp="2025-12-05 20:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:24:58.012345869 +0000 UTC m=+1163.309161530" watchObservedRunningTime="2025-12-05 20:24:58.016852197 +0000 UTC m=+1163.313667858" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.028863 4885 scope.go:117] "RemoveContainer" containerID="6ffa8c97a03eca7a23b606c7a59000c732dd8c61f2e987a29cfde1d7f34d8ac5" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.052854 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.061546 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.078772 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.080658 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.085545 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.085778 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.111628 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7wrx\" (UniqueName: \"kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124855 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124882 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.124991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226271 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226413 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7wrx\" (UniqueName: \"kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226472 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.226674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.227393 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.228261 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.230296 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.234317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.240497 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.241259 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.245099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.263575 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7wrx\" (UniqueName: \"kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.265204 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.268054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.326556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.438726 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5szt6" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.447705 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.648063 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data\") pod \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.648367 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rv2\" (UniqueName: \"kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2\") pod \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.648527 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle\") pod \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\" (UID: \"88521675-6180-4a17-ba7d-6bb9eb07e7dd\") " Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.665160 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "88521675-6180-4a17-ba7d-6bb9eb07e7dd" (UID: "88521675-6180-4a17-ba7d-6bb9eb07e7dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.684310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2" (OuterVolumeSpecName: "kube-api-access-92rv2") pod "88521675-6180-4a17-ba7d-6bb9eb07e7dd" (UID: "88521675-6180-4a17-ba7d-6bb9eb07e7dd"). InnerVolumeSpecName "kube-api-access-92rv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.766120 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.766189 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92rv2\" (UniqueName: \"kubernetes.io/projected/88521675-6180-4a17-ba7d-6bb9eb07e7dd-kube-api-access-92rv2\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.808808 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88521675-6180-4a17-ba7d-6bb9eb07e7dd" (UID: "88521675-6180-4a17-ba7d-6bb9eb07e7dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.867742 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88521675-6180-4a17-ba7d-6bb9eb07e7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:58 crc kubenswrapper[4885]: I1205 20:24:58.998220 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001633 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerID="1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" exitCode=0 Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001662 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerID="9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" exitCode=143 Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerDied","Data":"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c"} Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerDied","Data":"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b"} Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a9e9ea-8961-4525-9781-e66e829d1f13","Type":"ContainerDied","Data":"e42c28b1c161fd6089547ebbe6e46c73f9484f28c4b904614326fe401ebdd4f5"} Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.001782 4885 scope.go:117] "RemoveContainer" containerID="1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.007730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerStarted","Data":"b04b247389d6f41f39a5662af344993783aeaf9355c9d6fa9e7bcae64e652c12"} Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.032107 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5szt6" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.032972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5szt6" event={"ID":"88521675-6180-4a17-ba7d-6bb9eb07e7dd","Type":"ContainerDied","Data":"c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5"} Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.033332 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bcfbf3c09021cd5f4710dcdb41d4c360839690f780a1b9c2fe5a8f0f9658c5" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.058528 4885 scope.go:117] "RemoveContainer" containerID="9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.072985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073055 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073076 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w7hp\" (UniqueName: \"kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073172 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073192 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.073221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs\") pod \"a4a9e9ea-8961-4525-9781-e66e829d1f13\" (UID: \"a4a9e9ea-8961-4525-9781-e66e829d1f13\") " Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.081099 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: 
"a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.094393 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp" (OuterVolumeSpecName: "kube-api-access-8w7hp") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "kube-api-access-8w7hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.097142 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs" (OuterVolumeSpecName: "logs") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.108262 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts" (OuterVolumeSpecName: "scripts") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.121012 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.151785 4885 scope.go:117] "RemoveContainer" containerID="1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" Dec 05 20:24:59 crc kubenswrapper[4885]: E1205 20:24:59.161365 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c\": container with ID starting with 1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c not found: ID does not exist" containerID="1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.161430 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c"} err="failed to get container status \"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c\": rpc error: code = NotFound desc = could not find container \"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c\": container with ID starting with 1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c not found: ID does not exist" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.161461 4885 scope.go:117] "RemoveContainer" containerID="9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.162600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: E1205 20:24:59.165589 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b\": container with ID starting with 9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b not found: ID does not exist" containerID="9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.165634 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b"} err="failed to get container status \"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b\": rpc error: code = NotFound desc = could not find container \"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b\": container with ID starting with 9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b not found: ID does not exist" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.165665 4885 scope.go:117] "RemoveContainer" containerID="1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.198443 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c"} err="failed to get container status \"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c\": rpc error: code = NotFound desc = could not find container \"1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c\": container with ID starting with 1fa65f70ae3941218ba43a05b34880c21c3c1c8ac644794f040265d5e84e0c2c not found: ID does not exist" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.198533 4885 scope.go:117] "RemoveContainer" containerID="9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200356 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200388 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a9e9ea-8961-4525-9781-e66e829d1f13-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200402 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a9e9ea-8961-4525-9781-e66e829d1f13-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200414 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200427 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w7hp\" (UniqueName: \"kubernetes.io/projected/a4a9e9ea-8961-4525-9781-e66e829d1f13-kube-api-access-8w7hp\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.200441 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.202259 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b"} err="failed to get container status \"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b\": rpc error: code = NotFound desc = could not find container \"9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b\": container with ID starting with 9f30bd0a2d6afe28a52a683b174680915f0f04ca2c24d07c02673fdae453247b not found: ID does not exist" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.209179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data" (OuterVolumeSpecName: "config-data") pod "a4a9e9ea-8961-4525-9781-e66e829d1f13" (UID: "a4a9e9ea-8961-4525-9781-e66e829d1f13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.302349 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a9e9ea-8961-4525-9781-e66e829d1f13-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.457232 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c75814a-fdba-4523-9f63-1859ada5601d" path="/var/lib/kubelet/pods/0c75814a-fdba-4523-9f63-1859ada5601d/volumes" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.458939 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ad27ce-e57b-4dc9-a8c6-95edbd158105" path="/var/lib/kubelet/pods/70ad27ce-e57b-4dc9-a8c6-95edbd158105/volumes" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.462750 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-549d6dd897-jd542"] Dec 05 20:24:59 crc kubenswrapper[4885]: E1205 20:24:59.463124 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" containerName="barbican-db-sync" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463142 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" containerName="barbican-db-sync" Dec 05 20:24:59 crc kubenswrapper[4885]: E1205 20:24:59.463153 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463160 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api" Dec 05 20:24:59 crc kubenswrapper[4885]: E1205 20:24:59.463184 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api-log" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api-log" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463342 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api-log" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463363 4885 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" containerName="cinder-api" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.463382 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" containerName="barbican-db-sync" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.464335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-549d6dd897-jd542"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.464360 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f44776d88-2k4qb"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.465442 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f44776d88-2k4qb"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.465466 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.465480 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.466987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.467010 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.468194 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.468614 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.468935 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.469607 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.474212 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.474382 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ksdtr" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.474736 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.475649 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.475819 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 20:24:59 crc kubenswrapper[4885]: W1205 20:24:59.494306 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a2ee42f_a754_4128_a568_f321de7b1beb.slice/crio-70973bd42e9dabfc5e7aed5042044e739a5298534c7fc49813cd0bc2d79faaa2 WatchSource:0}: Error finding container 70973bd42e9dabfc5e7aed5042044e739a5298534c7fc49813cd0bc2d79faaa2: Status 404 returned error can't find the container with id 70973bd42e9dabfc5e7aed5042044e739a5298534c7fc49813cd0bc2d79faaa2 Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.509828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84964967-0f37-47c8-919f-3a68040a1d36-logs\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510132 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data-custom\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62qk\" (UniqueName: \"kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " 
pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510514 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-combined-ca-bundle\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10197ca-8886-4668-b3e8-1179bdb7041d-logs\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510793 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.510901 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzr5k\" (UniqueName: \"kubernetes.io/projected/84964967-0f37-47c8-919f-3a68040a1d36-kube-api-access-bzr5k\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511040 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data-custom\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-combined-ca-bundle\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bds5g\" (UniqueName: \"kubernetes.io/projected/f10197ca-8886-4668-b3e8-1179bdb7041d-kube-api-access-bds5g\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.511455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.513203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.518090 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.519564 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.519692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.519779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggkq\" (UniqueName: \"kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.515885 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.551042 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.621621 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62qk\" (UniqueName: \"kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622182 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622306 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-combined-ca-bundle\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10197ca-8886-4668-b3e8-1179bdb7041d-logs\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzr5k\" (UniqueName: \"kubernetes.io/projected/84964967-0f37-47c8-919f-3a68040a1d36-kube-api-access-bzr5k\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.622835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data-custom\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623252 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-combined-ca-bundle\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bds5g\" (UniqueName: \"kubernetes.io/projected/f10197ca-8886-4668-b3e8-1179bdb7041d-kube-api-access-bds5g\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623617 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623711 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.623973 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624402 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624564 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggkq\" (UniqueName: \"kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84964967-0f37-47c8-919f-3a68040a1d36-logs\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data-custom\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.625740 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.625851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.626218 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.626994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.624645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10197ca-8886-4668-b3e8-1179bdb7041d-logs\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.630697 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.630977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.632397 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-combined-ca-bundle\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.632960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84964967-0f37-47c8-919f-3a68040a1d36-logs\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.633569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-combined-ca-bundle\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.637948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data-custom\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.638094 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.640247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzr5k\" (UniqueName: \"kubernetes.io/projected/84964967-0f37-47c8-919f-3a68040a1d36-kube-api-access-bzr5k\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.644229 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10197ca-8886-4668-b3e8-1179bdb7041d-config-data\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.644544 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kggkq\" (UniqueName: \"kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq\") pod \"dnsmasq-dns-894d58c65-zbm4r\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.646042 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.646319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.646336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84964967-0f37-47c8-919f-3a68040a1d36-config-data-custom\") pod \"barbican-keystone-listener-f44776d88-2k4qb\" (UID: \"84964967-0f37-47c8-919f-3a68040a1d36\") " pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.647150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62qk\" (UniqueName: \"kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk\") pod \"barbican-api-5ff897b49b-mz22t\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.655318 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bds5g\" (UniqueName: \"kubernetes.io/projected/f10197ca-8886-4668-b3e8-1179bdb7041d-kube-api-access-bds5g\") pod \"barbican-worker-549d6dd897-jd542\" (UID: \"f10197ca-8886-4668-b3e8-1179bdb7041d\") " pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.904269 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.904915 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.907696 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-549d6dd897-jd542" Dec 05 20:24:59 crc kubenswrapper[4885]: I1205 20:24:59.908159 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.058799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerStarted","Data":"718b07ee517bcbc2f813f76df0a609506d09f57622f85d61de7d11107b153421"} Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.061781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerStarted","Data":"13798c03742164cd1289573e7496d5961fc8ec84954a964eac4af8c94c8e774e"} Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.061853 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.066489 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.071618 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerStarted","Data":"70973bd42e9dabfc5e7aed5042044e739a5298534c7fc49813cd0bc2d79faaa2"} Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.071803 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="dnsmasq-dns" containerID="cri-o://0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983" gracePeriod=10 Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.096178 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.472523823 podStartE2EDuration="8.096161804s" podCreationTimestamp="2025-12-05 20:24:52 +0000 UTC" firstStartedPulling="2025-12-05 20:24:53.918904068 +0000 UTC m=+1159.215719729" lastFinishedPulling="2025-12-05 20:24:58.542542049 +0000 UTC m=+1163.839357710" observedRunningTime="2025-12-05 20:25:00.088640423 +0000 UTC m=+1165.385456094" watchObservedRunningTime="2025-12-05 20:25:00.096161804 +0000 UTC m=+1165.392977465" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.112053 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.136122 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.179524 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.198535 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.198681 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.206361 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.206568 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.206706 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.251807 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/232e06c4-ecaf-4959-b1e2-0c183f6afb64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-public-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252397 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data-custom\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-scripts\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252555 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232e06c4-ecaf-4959-b1e2-0c183f6afb64-logs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr67l\" (UniqueName: \"kubernetes.io/projected/232e06c4-ecaf-4959-b1e2-0c183f6afb64-kube-api-access-cr67l\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " 
pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.252925 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.355700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data-custom\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.355941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-scripts\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232e06c4-ecaf-4959-b1e2-0c183f6afb64-logs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr67l\" (UniqueName: \"kubernetes.io/projected/232e06c4-ecaf-4959-b1e2-0c183f6afb64-kube-api-access-cr67l\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356147 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356251 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/232e06c4-ecaf-4959-b1e2-0c183f6afb64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356315 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-public-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.356338 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.359231 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/232e06c4-ecaf-4959-b1e2-0c183f6afb64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.359315 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232e06c4-ecaf-4959-b1e2-0c183f6afb64-logs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.361882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.362336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.363825 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data-custom\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.364393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-config-data\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.364757 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-public-tls-certs\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.368627 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232e06c4-ecaf-4959-b1e2-0c183f6afb64-scripts\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.378270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr67l\" (UniqueName: \"kubernetes.io/projected/232e06c4-ecaf-4959-b1e2-0c183f6afb64-kube-api-access-cr67l\") pod \"cinder-api-0\" (UID: \"232e06c4-ecaf-4959-b1e2-0c183f6afb64\") " pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.455753 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.530138 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.586964 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-549d6dd897-jd542"] Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.605468 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f44776d88-2k4qb"] Dec 05 20:25:00 crc kubenswrapper[4885]: W1205 20:25:00.655482 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84964967_0f37_47c8_919f_3a68040a1d36.slice/crio-261e0664928ecb9ad2cfec0bef188008afcef90d750408f1451f2fe7f9473198 WatchSource:0}: Error finding container 261e0664928ecb9ad2cfec0bef188008afcef90d750408f1451f2fe7f9473198: Status 404 returned error can't find the container with id 261e0664928ecb9ad2cfec0bef188008afcef90d750408f1451f2fe7f9473198 Dec 05 20:25:00 crc kubenswrapper[4885]: I1205 20:25:00.815651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"] Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.014721 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072687 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8956\" (UniqueName: \"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072823 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.072866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config\") pod \"b800e0bf-89cc-47a6-8984-63b35e27d593\" (UID: \"b800e0bf-89cc-47a6-8984-63b35e27d593\") " Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.095449 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956" (OuterVolumeSpecName: "kube-api-access-g8956") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "kube-api-access-g8956". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.101922 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" event={"ID":"84964967-0f37-47c8-919f-3a68040a1d36","Type":"ContainerStarted","Data":"261e0664928ecb9ad2cfec0bef188008afcef90d750408f1451f2fe7f9473198"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.122930 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerID="a61621e78d4ec6c77d02558e97f0a20f39d377bba59fc968c55f1fae1bc3ec50" exitCode=0 Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.123091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" event={"ID":"6a656ed8-8495-40ab-a37e-10f50b7eb513","Type":"ContainerDied","Data":"a61621e78d4ec6c77d02558e97f0a20f39d377bba59fc968c55f1fae1bc3ec50"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.123279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" event={"ID":"6a656ed8-8495-40ab-a37e-10f50b7eb513","Type":"ContainerStarted","Data":"da87d526edf9a4c927ed6b15fb59c60a33d065322e85e05cd4b73cd4f21a1d5e"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.127916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerStarted","Data":"4ba494d34d8de0a4199402322f9601dc9a8e575876a89a8f1e21df9ca8cb6408"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.136417 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerStarted","Data":"2d3307ebd18ff387a0240c326fb657960c0a774cb4ed54ed7b096e1a936acbd9"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.137992 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-549d6dd897-jd542" event={"ID":"f10197ca-8886-4668-b3e8-1179bdb7041d","Type":"ContainerStarted","Data":"a6d4002ccb01718b2c48dff92404e996cef0c99f3268ba9347a387b4bb3cde66"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.163131 4885 generic.go:334] "Generic (PLEG): container finished" podID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerID="0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983" exitCode=0 Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.163188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" event={"ID":"b800e0bf-89cc-47a6-8984-63b35e27d593","Type":"ContainerDied","Data":"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.163213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" event={"ID":"b800e0bf-89cc-47a6-8984-63b35e27d593","Type":"ContainerDied","Data":"18b9fa15912b5e9c2c564d5e8dfadccad08023f82b9a555ccb7936c26d6589d7"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.163230 4885 scope.go:117] "RemoveContainer" containerID="0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983" Dec 05 
20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.163352 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cb77c59-k7mtf" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.175829 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8956\" (UniqueName: \"kubernetes.io/projected/b800e0bf-89cc-47a6-8984-63b35e27d593-kube-api-access-g8956\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.184528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.1845126950000004 podStartE2EDuration="4.184512695s" podCreationTimestamp="2025-12-05 20:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:01.164359046 +0000 UTC m=+1166.461174707" watchObservedRunningTime="2025-12-05 20:25:01.184512695 +0000 UTC m=+1166.481328356" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.195231 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a9e9ea-8961-4525-9781-e66e829d1f13" path="/var/lib/kubelet/pods/a4a9e9ea-8961-4525-9781-e66e829d1f13/volumes" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.205276 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerStarted","Data":"6ab208155a9cb0552587aea741d5d3637c7f1b625991327a2e713b8532ee3134"} Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.214212 4885 scope.go:117] "RemoveContainer" containerID="7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.241903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.245239 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.245319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.255569 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.255551869 podStartE2EDuration="3.255551869s" podCreationTimestamp="2025-12-05 20:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:01.241653572 +0000 UTC m=+1166.538469233" watchObservedRunningTime="2025-12-05 20:25:01.255551869 +0000 UTC m=+1166.552367530" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.256812 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.277254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config" (OuterVolumeSpecName: "config") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.278286 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.278325 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.278335 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.278343 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.297191 4885 scope.go:117] "RemoveContainer" containerID="0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983" Dec 05 20:25:01 crc kubenswrapper[4885]: E1205 20:25:01.305572 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983\": container with ID starting with 0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983 not found: ID does not exist" containerID="0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.305619 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983"} err="failed to get container status \"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983\": rpc error: code = NotFound desc = could not find container \"0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983\": container with ID starting with 0418486839f5070678b358528c2a8c6f2eaf157414b1a65ed15e4b2f23698983 not found: ID does not exist" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.305644 4885 scope.go:117] 
"RemoveContainer" containerID="7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35" Dec 05 20:25:01 crc kubenswrapper[4885]: E1205 20:25:01.309949 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35\": container with ID starting with 7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35 not found: ID does not exist" containerID="7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.309979 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35"} err="failed to get container status \"7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35\": rpc error: code = NotFound desc = could not find container \"7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35\": container with ID starting with 7fc4384c6371092152456a4cd221ca8f7768425b19ba896b41cb90883cd3fe35 not found: ID does not exist" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.322287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b800e0bf-89cc-47a6-8984-63b35e27d593" (UID: "b800e0bf-89cc-47a6-8984-63b35e27d593"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.383903 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b800e0bf-89cc-47a6-8984-63b35e27d593-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.507512 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"] Dec 05 20:25:01 crc kubenswrapper[4885]: I1205 20:25:01.516452 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d6cb77c59-k7mtf"] Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.189816 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"232e06c4-ecaf-4959-b1e2-0c183f6afb64","Type":"ContainerStarted","Data":"65561322d17a6668558e8a6167bb5fb15134f0c8b81be4b51582419e72d04bd9"} Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.194843 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerStarted","Data":"7029cca0d3d40d2220c1c94a687d5932e7c88807b0abab2d26a4166e2917d62f"} Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.197652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" event={"ID":"6a656ed8-8495-40ab-a37e-10f50b7eb513","Type":"ContainerStarted","Data":"16796d5140408d20de6741a93998d7b9fdf4c7bead09366b51b45581442e0242"} Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.198717 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.201218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" 
event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerStarted","Data":"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e"} Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.201244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerStarted","Data":"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3"} Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.201679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.201734 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.220004 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" podStartSLOduration=3.21997968 podStartE2EDuration="3.21997968s" podCreationTimestamp="2025-12-05 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:02.215987587 +0000 UTC m=+1167.512803248" watchObservedRunningTime="2025-12-05 20:25:02.21997968 +0000 UTC m=+1167.516795341" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.237364 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5ff897b49b-mz22t" podStartSLOduration=3.237346194 podStartE2EDuration="3.237346194s" podCreationTimestamp="2025-12-05 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:02.233544137 +0000 UTC m=+1167.530359798" watchObservedRunningTime="2025-12-05 20:25:02.237346194 +0000 UTC m=+1167.534161855" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.329289 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.330606 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.331679 4885 scope.go:117] "RemoveContainer" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f" Dec 05 20:25:02 crc kubenswrapper[4885]: E1205 20:25:02.332071 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dd697974b-njsvr_openstack(2037cb2f-46ad-4a89-b430-91dd3568954f)\"" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.342360 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.148:9696/\": dial tcp 10.217.0.148:9696: connect: connection refused" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.721174 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-787956bb96-gzkln"] Dec 05 20:25:02 crc kubenswrapper[4885]: E1205 20:25:02.721556 4885 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="dnsmasq-dns" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.721572 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="dnsmasq-dns" Dec 05 20:25:02 crc kubenswrapper[4885]: E1205 20:25:02.721597 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="init" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.721603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="init" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.721793 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" containerName="dnsmasq-dns" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.722784 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.728226 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.740178 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787956bb96-gzkln"] Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.740469 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.811601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.811646 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data-custom\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.811670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-public-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.811926 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-combined-ca-bundle\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.812070 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af09c24-8a48-47cc-ad7c-1778f9a27547-logs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " 
pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.812113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhqc\" (UniqueName: \"kubernetes.io/projected/8af09c24-8a48-47cc-ad7c-1778f9a27547-kube-api-access-fkhqc\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.812137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-internal-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af09c24-8a48-47cc-ad7c-1778f9a27547-logs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhqc\" (UniqueName: \"kubernetes.io/projected/8af09c24-8a48-47cc-ad7c-1778f9a27547-kube-api-access-fkhqc\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-internal-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data-custom\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-public-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.913318 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-combined-ca-bundle\") pod \"barbican-api-787956bb96-gzkln\" (UID: 
\"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.914914 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af09c24-8a48-47cc-ad7c-1778f9a27547-logs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.920158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-public-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.925399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-combined-ca-bundle\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.925786 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.925988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-internal-tls-certs\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.928633 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af09c24-8a48-47cc-ad7c-1778f9a27547-config-data-custom\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:02 crc kubenswrapper[4885]: I1205 20:25:02.944208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhqc\" (UniqueName: \"kubernetes.io/projected/8af09c24-8a48-47cc-ad7c-1778f9a27547-kube-api-access-fkhqc\") pod \"barbican-api-787956bb96-gzkln\" (UID: \"8af09c24-8a48-47cc-ad7c-1778f9a27547\") " pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.081798 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.188482 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b800e0bf-89cc-47a6-8984-63b35e27d593" path="/var/lib/kubelet/pods/b800e0bf-89cc-47a6-8984-63b35e27d593/volumes" Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.226429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" event={"ID":"84964967-0f37-47c8-919f-3a68040a1d36","Type":"ContainerStarted","Data":"abf5f44b80eba7a1f5e931fcd934d7f37e13c8fa5b8e0e3cd0adaa10f67f52e7"} Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.260596 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"232e06c4-ecaf-4959-b1e2-0c183f6afb64","Type":"ContainerStarted","Data":"2c0be9de7a1d120cab04a70a37dde8fd888d285c33fd73eb7b307cb991eb6566"} Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.571481 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787956bb96-gzkln"] Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.624372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 20:25:03 crc kubenswrapper[4885]: I1205 20:25:03.669696 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.576854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-549d6dd897-jd542" event={"ID":"f10197ca-8886-4668-b3e8-1179bdb7041d","Type":"ContainerStarted","Data":"2216253c0ef67382f3d25c97d929ab635449d8b067739baf02648bd1b78d0be9"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.577441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-549d6dd897-jd542" event={"ID":"f10197ca-8886-4668-b3e8-1179bdb7041d","Type":"ContainerStarted","Data":"0e4dad1ae620d9b725adf5f69dbf17821655c5fbd3bb349bdd6555f62ea569ec"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.579436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-94b44cc8f-5tpnj" Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.607109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"232e06c4-ecaf-4959-b1e2-0c183f6afb64","Type":"ContainerStarted","Data":"93ad64860fc3411e49f30977e24294001af1dd9e7507e7ea7b113a45fc08e855"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.607958 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.613840 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-549d6dd897-jd542" podStartSLOduration=3.367734725 podStartE2EDuration="5.61381865s" podCreationTimestamp="2025-12-05 20:24:59 +0000 UTC" firstStartedPulling="2025-12-05 20:25:00.634599438 +0000 UTC m=+1165.931415099" lastFinishedPulling="2025-12-05 20:25:02.880683363 +0000 UTC m=+1168.177499024" observedRunningTime="2025-12-05 20:25:04.608188305 +0000 UTC m=+1169.905003966" watchObservedRunningTime="2025-12-05 20:25:04.61381865 +0000 UTC m=+1169.910634311" Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.629790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787956bb96-gzkln" 
event={"ID":"8af09c24-8a48-47cc-ad7c-1778f9a27547","Type":"ContainerStarted","Data":"bfb89cabe3dd02eaea0852ce02eaddf5a36aa9935a433823802030a226080f6a"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.629888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787956bb96-gzkln" event={"ID":"8af09c24-8a48-47cc-ad7c-1778f9a27547","Type":"ContainerStarted","Data":"ef9fcab406bc75a56916e1ed9c8a8508ce381695de5190f6ccc86802f7bc8e1d"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.637515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" event={"ID":"84964967-0f37-47c8-919f-3a68040a1d36","Type":"ContainerStarted","Data":"47119c8da14ae47c8bceda31938ce3850c6dd40e442d219d22fb9eefaeaceee9"} Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.638593 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="cinder-scheduler" containerID="cri-o://b41ea50d27ecba2edfa16fca65321dff4b320dd87d8d8ac6979364c3c888372d" gracePeriod=30 Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.638654 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="probe" containerID="cri-o://47b42e7d0f9fe620276453c4bf580e527f85a60f2406dbf360b083caaf8ce747" gracePeriod=30 Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.724669 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.724645525 podStartE2EDuration="4.724645525s" podCreationTimestamp="2025-12-05 20:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:04.675864295 +0000 UTC m=+1169.972679966" watchObservedRunningTime="2025-12-05 20:25:04.724645525 +0000 UTC m=+1170.021461186" Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.778032 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f44776d88-2k4qb" podStartSLOduration=3.556749478 podStartE2EDuration="5.777999659s" podCreationTimestamp="2025-12-05 20:24:59 +0000 UTC" firstStartedPulling="2025-12-05 20:25:00.658489833 +0000 UTC m=+1165.955305494" lastFinishedPulling="2025-12-05 20:25:02.879740014 +0000 UTC m=+1168.176555675" observedRunningTime="2025-12-05 20:25:04.715767208 +0000 UTC m=+1170.012582859" watchObservedRunningTime="2025-12-05 20:25:04.777999659 +0000 UTC m=+1170.074815320" Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.778079 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dd697974b-njsvr"] Dec 05 20:25:04 crc kubenswrapper[4885]: I1205 20:25:04.778478 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6dd697974b-njsvr" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-api" containerID="cri-o://24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa" gracePeriod=30 Dec 05 20:25:05 crc kubenswrapper[4885]: I1205 20:25:05.653166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787956bb96-gzkln" event={"ID":"8af09c24-8a48-47cc-ad7c-1778f9a27547","Type":"ContainerStarted","Data":"e6d52fbe894162c464082f543a3cfe9e7c51c6ddb22c50b9e86ca9ef51ee4345"} Dec 05 20:25:05 crc 
kubenswrapper[4885]: I1205 20:25:05.653731 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:05 crc kubenswrapper[4885]: I1205 20:25:05.655218 4885 generic.go:334] "Generic (PLEG): container finished" podID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerID="47b42e7d0f9fe620276453c4bf580e527f85a60f2406dbf360b083caaf8ce747" exitCode=0 Dec 05 20:25:05 crc kubenswrapper[4885]: I1205 20:25:05.655335 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerDied","Data":"47b42e7d0f9fe620276453c4bf580e527f85a60f2406dbf360b083caaf8ce747"} Dec 05 20:25:05 crc kubenswrapper[4885]: I1205 20:25:05.672610 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-787956bb96-gzkln" podStartSLOduration=3.672595473 podStartE2EDuration="3.672595473s" podCreationTimestamp="2025-12-05 20:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:05.66896588 +0000 UTC m=+1170.965781541" watchObservedRunningTime="2025-12-05 20:25:05.672595473 +0000 UTC m=+1170.969411134" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.214315 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/2.log" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.215014 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.396159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbcsg\" (UniqueName: \"kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg\") pod \"2037cb2f-46ad-4a89-b430-91dd3568954f\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.396320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config\") pod \"2037cb2f-46ad-4a89-b430-91dd3568954f\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.396359 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs\") pod \"2037cb2f-46ad-4a89-b430-91dd3568954f\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.396378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle\") pod \"2037cb2f-46ad-4a89-b430-91dd3568954f\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.396566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config\") pod \"2037cb2f-46ad-4a89-b430-91dd3568954f\" (UID: \"2037cb2f-46ad-4a89-b430-91dd3568954f\") " Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.402210 4885 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2037cb2f-46ad-4a89-b430-91dd3568954f" (UID: "2037cb2f-46ad-4a89-b430-91dd3568954f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.402328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg" (OuterVolumeSpecName: "kube-api-access-kbcsg") pod "2037cb2f-46ad-4a89-b430-91dd3568954f" (UID: "2037cb2f-46ad-4a89-b430-91dd3568954f"). InnerVolumeSpecName "kube-api-access-kbcsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.446012 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2037cb2f-46ad-4a89-b430-91dd3568954f" (UID: "2037cb2f-46ad-4a89-b430-91dd3568954f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.459784 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config" (OuterVolumeSpecName: "config") pod "2037cb2f-46ad-4a89-b430-91dd3568954f" (UID: "2037cb2f-46ad-4a89-b430-91dd3568954f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.474333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2037cb2f-46ad-4a89-b430-91dd3568954f" (UID: "2037cb2f-46ad-4a89-b430-91dd3568954f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.498424 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.498469 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbcsg\" (UniqueName: \"kubernetes.io/projected/2037cb2f-46ad-4a89-b430-91dd3568954f-kube-api-access-kbcsg\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.498481 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.498489 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.498499 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2037cb2f-46ad-4a89-b430-91dd3568954f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667186 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dd697974b-njsvr_2037cb2f-46ad-4a89-b430-91dd3568954f/neutron-httpd/2.log" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667677 4885 generic.go:334] "Generic (PLEG): container finished" podID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerID="24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa" exitCode=0 Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667763 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerDied","Data":"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa"} Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667792 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dd697974b-njsvr" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667819 4885 scope.go:117] "RemoveContainer" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.667803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dd697974b-njsvr" event={"ID":"2037cb2f-46ad-4a89-b430-91dd3568954f","Type":"ContainerDied","Data":"8d3b72e6e0e72f68b71baed20e96d8fe31690cae4d9d5608f4d79c762ab6d106"} Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.668007 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787956bb96-gzkln" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.698841 4885 scope.go:117] "RemoveContainer" containerID="24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.755197 4885 scope.go:117] "RemoveContainer" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f" Dec 05 20:25:06 crc kubenswrapper[4885]: E1205 20:25:06.755673 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f\": container with ID starting with 887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f not found: ID does not exist" containerID="887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.755714 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f"} err="failed to get container status \"887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f\": rpc error: code = NotFound desc = could not find container \"887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f\": container with ID starting with 887edbc799e9e64a2fcfe7e14853a2d60577d8dd493d4919b225740db949ad6f not found: ID does not exist" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.755742 4885 scope.go:117] "RemoveContainer" containerID="24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa" Dec 05 20:25:06 crc kubenswrapper[4885]: E1205 20:25:06.756031 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa\": container with ID starting with 24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa not found: ID does not exist" containerID="24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.756065 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa"} err="failed to get container status \"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa\": rpc error: code = NotFound desc = could not find container \"24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa\": container with ID starting with 24d7e5c52698dcceb0e5a78c1a2123b1e1bacbf374c670ffc139597735ac4ffa not found: ID does not exist" Dec 05 20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.758206 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dd697974b-njsvr"] Dec 05 
20:25:06 crc kubenswrapper[4885]: I1205 20:25:06.767139 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6dd697974b-njsvr"] Dec 05 20:25:07 crc kubenswrapper[4885]: I1205 20:25:07.185234 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" path="/var/lib/kubelet/pods/2037cb2f-46ad-4a89-b430-91dd3568954f/volumes" Dec 05 20:25:07 crc kubenswrapper[4885]: I1205 20:25:07.641215 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:07 crc kubenswrapper[4885]: I1205 20:25:07.641319 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:07 crc kubenswrapper[4885]: I1205 20:25:07.751714 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:07 crc kubenswrapper[4885]: I1205 20:25:07.817398 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.448542 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.448640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.483720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.503006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738245 4885 generic.go:334] "Generic (PLEG): container finished" podID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerID="b41ea50d27ecba2edfa16fca65321dff4b320dd87d8d8ac6979364c3c888372d" exitCode=0 Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerDied","Data":"b41ea50d27ecba2edfa16fca65321dff4b320dd87d8d8ac6979364c3c888372d"} Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738625 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738659 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738668 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.738677 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:25:08 crc kubenswrapper[4885]: I1205 20:25:08.748227 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.133100 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.227445 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d9999949d-c22ch" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.257778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g96nt\" (UniqueName: \"kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.257857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.257954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.257977 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.258046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.258164 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom\") pod \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\" (UID: \"46e5a3b1-b389-45b0-a539-7197ce0b9b4e\") " Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.267133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.271241 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt" (OuterVolumeSpecName: "kube-api-access-g96nt") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "kube-api-access-g96nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.279078 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts" (OuterVolumeSpecName: "scripts") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.295266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.346065 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.359987 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.361521 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g96nt\" (UniqueName: \"kubernetes.io/projected/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-kube-api-access-g96nt\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.361564 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.361575 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.361584 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.425006 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data" (OuterVolumeSpecName: "config-data") pod "46e5a3b1-b389-45b0-a539-7197ce0b9b4e" (UID: "46e5a3b1-b389-45b0-a539-7197ce0b9b4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.462536 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e5a3b1-b389-45b0-a539-7197ce0b9b4e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.748919 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e5a3b1-b389-45b0-a539-7197ce0b9b4e","Type":"ContainerDied","Data":"54b477bee180673cd7896114bc320c2eb91386652ac31f8a3759127983228a13"} Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.748944 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.748975 4885 scope.go:117] "RemoveContainer" containerID="47b42e7d0f9fe620276453c4bf580e527f85a60f2406dbf360b083caaf8ce747" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.788783 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.802576 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.824282 4885 scope.go:117] "RemoveContainer" containerID="b41ea50d27ecba2edfa16fca65321dff4b320dd87d8d8ac6979364c3c888372d" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.829839 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830169 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830184 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830197 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-api" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830205 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-api" Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830221 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="probe" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830228 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="probe" Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830249 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830255 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830273 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="cinder-scheduler" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830279 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" 
containerName="cinder-scheduler" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830431 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830444 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830455 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830465 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="cinder-scheduler" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830484 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-api" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830497 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" containerName="probe" Dec 05 20:25:09 crc kubenswrapper[4885]: E1205 20:25:09.830668 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.830680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2037cb2f-46ad-4a89-b430-91dd3568954f" containerName="neutron-httpd" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.831446 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.835485 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.836207 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.906241 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.974566 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"] Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.974824 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="dnsmasq-dns" containerID="cri-o://f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663" gracePeriod=10 Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff2041-1a3f-46c9-ba86-9440a4c1e129-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-scripts\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:09 crc kubenswrapper[4885]: I1205 20:25:09.976597 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmq9\" (UniqueName: \"kubernetes.io/projected/85ff2041-1a3f-46c9-ba86-9440a4c1e129-kube-api-access-vfmq9\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.082923 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.082971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-scripts\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.083640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmq9\" (UniqueName: \"kubernetes.io/projected/85ff2041-1a3f-46c9-ba86-9440a4c1e129-kube-api-access-vfmq9\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.083691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff2041-1a3f-46c9-ba86-9440a4c1e129-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.083773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.083819 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.084274 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff2041-1a3f-46c9-ba86-9440a4c1e129-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.106573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-scripts\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.106709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.107354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.111733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff2041-1a3f-46c9-ba86-9440a4c1e129-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.127756 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmq9\" (UniqueName: \"kubernetes.io/projected/85ff2041-1a3f-46c9-ba86-9440a4c1e129-kube-api-access-vfmq9\") pod \"cinder-scheduler-0\" (UID: \"85ff2041-1a3f-46c9-ba86-9440a4c1e129\") " pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.164531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.620341 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.717797 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zc7\" (UniqueName: \"kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.717862 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.717887 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.717916 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.717947 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.718045 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.748171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7" (OuterVolumeSpecName: "kube-api-access-46zc7") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "kube-api-access-46zc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.819836 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zc7\" (UniqueName: \"kubernetes.io/projected/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-kube-api-access-46zc7\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.822573 4885 generic.go:334] "Generic (PLEG): container finished" podID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerID="f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663" exitCode=0 Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.822609 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" event={"ID":"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b","Type":"ContainerDied","Data":"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663"} Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.822633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" event={"ID":"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b","Type":"ContainerDied","Data":"49b881bc8876759a5895dab48d2450d1d5d5bb5e21260ae081ea1bf19108b228"} Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.822641 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d9b68659-r2zdz" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.822648 4885 scope.go:117] "RemoveContainer" containerID="f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.832856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 20:25:10 crc kubenswrapper[4885]: W1205 20:25:10.858372 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ff2041_1a3f_46c9_ba86_9440a4c1e129.slice/crio-70fe6bd2001da8e03e4cf72292eab024e864e628ec41165e48cf4fee318a3728 WatchSource:0}: Error finding container 70fe6bd2001da8e03e4cf72292eab024e864e628ec41165e48cf4fee318a3728: Status 404 returned error can't find the container with id 70fe6bd2001da8e03e4cf72292eab024e864e628ec41165e48cf4fee318a3728 Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.897788 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.926693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.927615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config" (OuterVolumeSpecName: "config") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.937456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") pod \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\" (UID: \"8a0bed2d-1fb8-4e60-8d5b-a468aab8985b\") " Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.938241 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.938254 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:10 crc kubenswrapper[4885]: W1205 20:25:10.938327 4885 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b/volumes/kubernetes.io~configmap/config Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.938341 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config" (OuterVolumeSpecName: "config") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.961457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:10 crc kubenswrapper[4885]: I1205 20:25:10.969955 4885 scope.go:117] "RemoveContainer" containerID="3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.014624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" (UID: "8a0bed2d-1fb8-4e60-8d5b-a468aab8985b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.026868 4885 scope.go:117] "RemoveContainer" containerID="f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663" Dec 05 20:25:11 crc kubenswrapper[4885]: E1205 20:25:11.028173 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663\": container with ID starting with f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663 not found: ID does not exist" containerID="f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.028230 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663"} err="failed to get container status \"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663\": rpc error: code = NotFound desc = could not find container \"f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663\": container with ID starting with f9fef873318696b70a25f2d57ffb86aba4a6e93dfa65a84129702236af8bb663 not found: ID does not exist" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.028251 4885 scope.go:117] "RemoveContainer" containerID="3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f" Dec 05 20:25:11 crc kubenswrapper[4885]: E1205 20:25:11.030344 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f\": container with ID starting with 3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f not found: ID does not exist" containerID="3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.030397 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f"} err="failed to get container status \"3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f\": rpc error: code = NotFound desc = could not find container \"3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f\": container with ID starting with 3cf8f958c8e116487ec9ee4b025648eb7d764644432db690bd070e7f5f9d246f not found: ID does not exist" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.040478 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.040733 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.040828 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.205189 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e5a3b1-b389-45b0-a539-7197ce0b9b4e" path="/var/lib/kubelet/pods/46e5a3b1-b389-45b0-a539-7197ce0b9b4e/volumes" 
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.206000 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"]
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.206042 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54d9b68659-r2zdz"]
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.737450 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.737868 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.848512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85ff2041-1a3f-46c9-ba86-9440a4c1e129","Type":"ContainerStarted","Data":"90f5507ecdb7cd0f72f640f79dab083497e8c9c67c8d9e7c4119bcb2e998f36b"}
Dec 05 20:25:11 crc kubenswrapper[4885]: I1205 20:25:11.848557 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85ff2041-1a3f-46c9-ba86-9440a4c1e129","Type":"ContainerStarted","Data":"70fe6bd2001da8e03e4cf72292eab024e864e628ec41165e48cf4fee318a3728"}
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.056815 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.057144 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.074569 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787956bb96-gzkln"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.364338 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5ff897b49b-mz22t"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.583430 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7ddb869454-vvfd9"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.687579 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d9999949d-c22ch"
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.765201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"]
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.864905 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon-log" containerID="cri-o://c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97" gracePeriod=30
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.865272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85ff2041-1a3f-46c9-ba86-9440a4c1e129","Type":"ContainerStarted","Data":"d6a5f708c3f50923707cfc630a053055022e3ed087382e191e755ed9257f04a1"}
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.865340 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" containerID="cri-o://7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5" gracePeriod=30
Dec 05 20:25:12 crc kubenswrapper[4885]: I1205 20:25:12.887760 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.887743824 podStartE2EDuration="3.887743824s" podCreationTimestamp="2025-12-05 20:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:12.88693888 +0000 UTC m=+1178.183754541" watchObservedRunningTime="2025-12-05 20:25:12.887743824 +0000 UTC m=+1178.184559485"
Dec 05 20:25:13 crc kubenswrapper[4885]: I1205 20:25:13.195227 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" path="/var/lib/kubelet/pods/8a0bed2d-1fb8-4e60-8d5b-a468aab8985b/volumes"
Dec 05 20:25:13 crc kubenswrapper[4885]: I1205 20:25:13.361222 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 05 20:25:13 crc kubenswrapper[4885]: I1205 20:25:13.545922 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 05 20:25:14 crc kubenswrapper[4885]: I1205 20:25:14.291316 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 05 20:25:14 crc kubenswrapper[4885]: I1205 20:25:14.901858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5ff897b49b-mz22t"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.165908 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.365606 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787956bb96-gzkln"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.418352 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"]
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.892523 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" containerID="cri-o://b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3" gracePeriod=30
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.892556 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" containerID="cri-o://3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e" gracePeriod=30
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.899037 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.899852 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.899917 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Dec 05 20:25:15 crc kubenswrapper[4885]: I1205 20:25:15.950606 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d98fd5798-8jhxf"
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.013588 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bdf6f4c4b-9n2vm"
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.659132 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d98fd5798-8jhxf"
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.874914 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused"
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.911184 4885 generic.go:334] "Generic (PLEG): container finished" podID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerID="7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5" exitCode=0
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.911247 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerDied","Data":"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5"}
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.914379 4885 generic.go:334] "Generic (PLEG): container finished" podID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerID="b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3" exitCode=143
Dec 05 20:25:16 crc kubenswrapper[4885]: I1205 20:25:16.914474 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerDied","Data":"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3"}
Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.879144 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 05 20:25:17 crc kubenswrapper[4885]: E1205 20:25:17.879612 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="dnsmasq-dns"
Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.879628 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="dnsmasq-dns"
Dec 05 20:25:17 crc kubenswrapper[4885]: E1205 20:25:17.879675 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="init"
Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.879685 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="init"
Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.879935 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0bed2d-1fb8-4e60-8d5b-a468aab8985b" containerName="dnsmasq-dns"
Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.880693 4885 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstackclient" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.882793 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2zks2" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.883383 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.888794 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.890228 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.992958 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.993254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.993354 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plhjk\" (UniqueName: \"kubernetes.io/projected/60b132f9-5036-44cd-8d19-e60a39760da0-kube-api-access-plhjk\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:17 crc kubenswrapper[4885]: I1205 20:25:17.993448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.094549 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.094898 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.095046 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.095152 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-plhjk\" (UniqueName: \"kubernetes.io/projected/60b132f9-5036-44cd-8d19-e60a39760da0-kube-api-access-plhjk\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.095438 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.100534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.114578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plhjk\" (UniqueName: \"kubernetes.io/projected/60b132f9-5036-44cd-8d19-e60a39760da0-kube-api-access-plhjk\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.123246 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60b132f9-5036-44cd-8d19-e60a39760da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"60b132f9-5036-44cd-8d19-e60a39760da0\") " pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.208852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 20:25:18 crc kubenswrapper[4885]: W1205 20:25:18.696775 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b132f9_5036_44cd_8d19_e60a39760da0.slice/crio-2e4de8eb1d99be0e26a12ea059c20ced28684d75cb607c783512ad8849e46039 WatchSource:0}: Error finding container 2e4de8eb1d99be0e26a12ea059c20ced28684d75cb607c783512ad8849e46039: Status 404 returned error can't find the container with id 2e4de8eb1d99be0e26a12ea059c20ced28684d75cb607c783512ad8849e46039 Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.702221 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 20:25:18 crc kubenswrapper[4885]: I1205 20:25:18.935213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"60b132f9-5036-44cd-8d19-e60a39760da0","Type":"ContainerStarted","Data":"2e4de8eb1d99be0e26a12ea059c20ced28684d75cb607c783512ad8849e46039"} Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.378687 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40678->10.217.0.164:9311: read: connection reset by peer" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.378763 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40686->10.217.0.164:9311: read: connection reset by peer" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.379529 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5ff897b49b-mz22t" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.379623 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.421195 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.830833 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953711 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62qk\" (UniqueName: \"kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk\") pod \"b6418540-35fb-49a0-8e02-8540c41d59f1\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953714 4885 generic.go:334] "Generic (PLEG): container finished" podID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerID="3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e" exitCode=0 Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953756 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle\") pod \"b6418540-35fb-49a0-8e02-8540c41d59f1\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerDied","Data":"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e"} Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953782 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5ff897b49b-mz22t" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953802 4885 scope.go:117] "RemoveContainer" containerID="3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953872 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom\") pod \"b6418540-35fb-49a0-8e02-8540c41d59f1\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.954039 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data\") pod \"b6418540-35fb-49a0-8e02-8540c41d59f1\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.954123 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs\") pod \"b6418540-35fb-49a0-8e02-8540c41d59f1\" (UID: \"b6418540-35fb-49a0-8e02-8540c41d59f1\") " Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.953789 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5ff897b49b-mz22t" event={"ID":"b6418540-35fb-49a0-8e02-8540c41d59f1","Type":"ContainerDied","Data":"4ba494d34d8de0a4199402322f9601dc9a8e575876a89a8f1e21df9ca8cb6408"} Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.958451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs" (OuterVolumeSpecName: "logs") pod "b6418540-35fb-49a0-8e02-8540c41d59f1" (UID: "b6418540-35fb-49a0-8e02-8540c41d59f1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.963199 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6418540-35fb-49a0-8e02-8540c41d59f1" (UID: "b6418540-35fb-49a0-8e02-8540c41d59f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.963439 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk" (OuterVolumeSpecName: "kube-api-access-f62qk") pod "b6418540-35fb-49a0-8e02-8540c41d59f1" (UID: "b6418540-35fb-49a0-8e02-8540c41d59f1"). InnerVolumeSpecName "kube-api-access-f62qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:20 crc kubenswrapper[4885]: I1205 20:25:20.987921 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6418540-35fb-49a0-8e02-8540c41d59f1" (UID: "b6418540-35fb-49a0-8e02-8540c41d59f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.028384 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data" (OuterVolumeSpecName: "config-data") pod "b6418540-35fb-49a0-8e02-8540c41d59f1" (UID: "b6418540-35fb-49a0-8e02-8540c41d59f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.053806 4885 scope.go:117] "RemoveContainer" containerID="b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.057671 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62qk\" (UniqueName: \"kubernetes.io/projected/b6418540-35fb-49a0-8e02-8540c41d59f1-kube-api-access-f62qk\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.057702 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.057717 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.057728 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6418540-35fb-49a0-8e02-8540c41d59f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.057739 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6418540-35fb-49a0-8e02-8540c41d59f1-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.082583 4885 scope.go:117] "RemoveContainer" containerID="3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e" Dec 05 20:25:21 crc kubenswrapper[4885]: E1205 20:25:21.083097 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e\": container with ID starting with 3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e not found: ID does not exist" containerID="3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.083142 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e"} err="failed to get container status \"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e\": rpc error: code = NotFound desc = could not find container \"3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e\": container with ID starting with 3a0107ae0a3e4161f64caa2c0528076f60e8067d2eb31b066e23c73178dcbb9e not found: ID does not exist" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.083167 4885 scope.go:117] "RemoveContainer" containerID="b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3" Dec 05 20:25:21 crc kubenswrapper[4885]: E1205 20:25:21.083566 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3\": container with ID starting with b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3 not found: ID does not exist" containerID="b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.083990 4885 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3"} err="failed to get container status \"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3\": rpc error: code = NotFound desc = could not find container \"b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3\": container with ID starting with b9476cb53586e59a9259b0918a05c08ac0674098e4c3f82b1fad4af5482b18d3 not found: ID does not exist" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.143954 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-56b6f678f7-nt7kq"] Dec 05 20:25:21 crc kubenswrapper[4885]: E1205 20:25:21.144411 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.144432 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" Dec 05 20:25:21 crc kubenswrapper[4885]: E1205 20:25:21.148646 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.148678 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.149050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api-log" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.149103 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" containerName="barbican-api" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.150263 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.152295 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.152804 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.154232 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.155473 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-56b6f678f7-nt7kq"] Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgl7l\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-kube-api-access-vgl7l\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262674 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-combined-ca-bundle\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-etc-swift\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-public-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262738 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-log-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262826 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-run-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-config-data\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " 
pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.262888 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-internal-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.300288 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"] Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.319520 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5ff897b49b-mz22t"] Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364125 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-run-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-config-data\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-internal-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgl7l\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-kube-api-access-vgl7l\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-combined-ca-bundle\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-etc-swift\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.364436 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-public-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 
20:25:21.364462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-log-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.365604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-log-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.368248 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-run-httpd\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.394253 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-combined-ca-bundle\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.395168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-config-data\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.396271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-etc-swift\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.396748 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-public-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.397279 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgl7l\" (UniqueName: \"kubernetes.io/projected/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-kube-api-access-vgl7l\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.397544 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df6ff8a-e66c-402d-a7cd-63125b9c6cae-internal-tls-certs\") pod \"swift-proxy-56b6f678f7-nt7kq\" (UID: \"5df6ff8a-e66c-402d-a7cd-63125b9c6cae\") " pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.471712 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.897504 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.898031 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-central-agent" containerID="cri-o://7477b3c060d022f8f50cf4bd9b5f3036aea8e423314f2393aedf009bdffd1f91" gracePeriod=30 Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.898147 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="sg-core" containerID="cri-o://03a24ddc48d7509b7fe69939b8d26cf41d3ee07988222a741a9025493e4c0d22" gracePeriod=30 Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.898179 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-notification-agent" containerID="cri-o://82480b8bca44fa4540af3d46d873a0f5a5e6e84917799b9e43d88e1171d88b47" gracePeriod=30 Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.898146 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd" containerID="cri-o://13798c03742164cd1289573e7496d5961fc8ec84954a964eac4af8c94c8e774e" gracePeriod=30 Dec 05 20:25:21 crc kubenswrapper[4885]: I1205 20:25:21.902986 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.155:3000/\": EOF" Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.037489 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-56b6f678f7-nt7kq"] Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.972721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-56b6f678f7-nt7kq" event={"ID":"5df6ff8a-e66c-402d-a7cd-63125b9c6cae","Type":"ContainerStarted","Data":"369d45712f0bc6adfe78c5c6b0aa56701d3f3a27b17a29c09c306e7e942474be"} Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.973223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-56b6f678f7-nt7kq" event={"ID":"5df6ff8a-e66c-402d-a7cd-63125b9c6cae","Type":"ContainerStarted","Data":"712926b520742b827a60d2844e4fbcb6e3e704b9ab39a3a6575c303dda925a58"} Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.973238 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-56b6f678f7-nt7kq" event={"ID":"5df6ff8a-e66c-402d-a7cd-63125b9c6cae","Type":"ContainerStarted","Data":"21c9755cd7832413e59c554b32cfb41f521e70fae377bf0977aed8531c36c7d9"} Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.973284 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.973302 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975369 4885 generic.go:334] "Generic (PLEG): container finished" podID="962a5840-991a-4f47-960f-b75f1bc33fa8" 
containerID="13798c03742164cd1289573e7496d5961fc8ec84954a964eac4af8c94c8e774e" exitCode=0 Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975391 4885 generic.go:334] "Generic (PLEG): container finished" podID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerID="03a24ddc48d7509b7fe69939b8d26cf41d3ee07988222a741a9025493e4c0d22" exitCode=2 Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975398 4885 generic.go:334] "Generic (PLEG): container finished" podID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerID="7477b3c060d022f8f50cf4bd9b5f3036aea8e423314f2393aedf009bdffd1f91" exitCode=0 Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerDied","Data":"13798c03742164cd1289573e7496d5961fc8ec84954a964eac4af8c94c8e774e"} Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerDied","Data":"03a24ddc48d7509b7fe69939b8d26cf41d3ee07988222a741a9025493e4c0d22"} Dec 05 20:25:22 crc kubenswrapper[4885]: I1205 20:25:22.975439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerDied","Data":"7477b3c060d022f8f50cf4bd9b5f3036aea8e423314f2393aedf009bdffd1f91"} Dec 05 20:25:23 crc kubenswrapper[4885]: I1205 20:25:23.014315 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.155:3000/\": dial tcp 10.217.0.155:3000: connect: connection refused" Dec 05 20:25:23 crc kubenswrapper[4885]: I1205 20:25:23.182527 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6418540-35fb-49a0-8e02-8540c41d59f1" path="/var/lib/kubelet/pods/b6418540-35fb-49a0-8e02-8540c41d59f1/volumes" Dec 05 20:25:25 crc kubenswrapper[4885]: I1205 20:25:25.202216 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-56b6f678f7-nt7kq" podStartSLOduration=4.202200164 podStartE2EDuration="4.202200164s" podCreationTimestamp="2025-12-05 20:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:22.992576087 +0000 UTC m=+1188.289391738" watchObservedRunningTime="2025-12-05 20:25:25.202200164 +0000 UTC m=+1190.499015825" Dec 05 20:25:26 crc kubenswrapper[4885]: I1205 20:25:26.018056 4885 generic.go:334] "Generic (PLEG): container finished" podID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerID="82480b8bca44fa4540af3d46d873a0f5a5e6e84917799b9e43d88e1171d88b47" exitCode=0 Dec 05 20:25:26 crc kubenswrapper[4885]: I1205 20:25:26.018109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerDied","Data":"82480b8bca44fa4540af3d46d873a0f5a5e6e84917799b9e43d88e1171d88b47"} Dec 05 20:25:26 crc kubenswrapper[4885]: I1205 20:25:26.874356 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.0.144:8443: connect: connection refused" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.477840 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.482798 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-56b6f678f7-nt7kq" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.866093 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911663 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44cq9\" (UniqueName: \"kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911802 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911838 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911924 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.911970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.912056 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd\") pod \"962a5840-991a-4f47-960f-b75f1bc33fa8\" (UID: \"962a5840-991a-4f47-960f-b75f1bc33fa8\") " Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.912389 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.912515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.912642 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.912667 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/962a5840-991a-4f47-960f-b75f1bc33fa8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.926222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts" (OuterVolumeSpecName: "scripts") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:31 crc kubenswrapper[4885]: I1205 20:25:31.934279 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9" (OuterVolumeSpecName: "kube-api-access-44cq9") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "kube-api-access-44cq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.014369 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44cq9\" (UniqueName: \"kubernetes.io/projected/962a5840-991a-4f47-960f-b75f1bc33fa8-kube-api-access-44cq9\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.014409 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.031718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.073660 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.077382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"962a5840-991a-4f47-960f-b75f1bc33fa8","Type":"ContainerDied","Data":"64141964f111aee59ad5027cc88e28904f16af8a8b3e6f9b067cfef82ea32041"} Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.077436 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.077439 4885 scope.go:117] "RemoveContainer" containerID="13798c03742164cd1289573e7496d5961fc8ec84954a964eac4af8c94c8e774e" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.086200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data" (OuterVolumeSpecName: "config-data") pod "962a5840-991a-4f47-960f-b75f1bc33fa8" (UID: "962a5840-991a-4f47-960f-b75f1bc33fa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.103386 4885 scope.go:117] "RemoveContainer" containerID="03a24ddc48d7509b7fe69939b8d26cf41d3ee07988222a741a9025493e4c0d22" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.116682 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.116738 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.116749 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962a5840-991a-4f47-960f-b75f1bc33fa8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.127724 4885 scope.go:117] "RemoveContainer" containerID="82480b8bca44fa4540af3d46d873a0f5a5e6e84917799b9e43d88e1171d88b47" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.168285 4885 scope.go:117] "RemoveContainer" containerID="7477b3c060d022f8f50cf4bd9b5f3036aea8e423314f2393aedf009bdffd1f91" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.412456 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.424035 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.432936 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:25:32 crc kubenswrapper[4885]: E1205 20:25:32.433427 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-central-agent" Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433458 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-central-agent" Dec 05 20:25:32 crc kubenswrapper[4885]: E1205 20:25:32.433469 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-notification-agent" Dec 05 
20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433477 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-notification-agent"
Dec 05 20:25:32 crc kubenswrapper[4885]: E1205 20:25:32.433517 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433523 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd"
Dec 05 20:25:32 crc kubenswrapper[4885]: E1205 20:25:32.433530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="sg-core"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433536 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="sg-core"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433697 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="proxy-httpd"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433711 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="sg-core"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433720 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-notification-agent"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.433732 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" containerName="ceilometer-central-agent"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.435351 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.437810 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.437969 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.448707 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523564 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqh9d\" (UniqueName: \"kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.523780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625484 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625645 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqh9d\" (UniqueName: \"kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.625759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.626888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.627141 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.629698 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.629885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.630953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.631658 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.643502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqh9d\" (UniqueName: \"kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d\") pod \"ceilometer-0\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " pod="openstack/ceilometer-0"
Dec 05 20:25:32 crc kubenswrapper[4885]: I1205 20:25:32.753282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:25:33 crc kubenswrapper[4885]: I1205 20:25:33.087157 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"60b132f9-5036-44cd-8d19-e60a39760da0","Type":"ContainerStarted","Data":"fcdcb7fecebb905bea68a295e2bda15717c20e01c6a019b492687c627b2d3381"}
Dec 05 20:25:33 crc kubenswrapper[4885]: I1205 20:25:33.110451 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.1177714 podStartE2EDuration="16.110433196s" podCreationTimestamp="2025-12-05 20:25:17 +0000 UTC" firstStartedPulling="2025-12-05 20:25:18.69958381 +0000 UTC m=+1183.996399471" lastFinishedPulling="2025-12-05 20:25:31.692245606 +0000 UTC m=+1196.989061267" observedRunningTime="2025-12-05 20:25:33.101800687 +0000 UTC m=+1198.398616348" watchObservedRunningTime="2025-12-05 20:25:33.110433196 +0000 UTC m=+1198.407248857"
Dec 05 20:25:33 crc kubenswrapper[4885]: W1205 20:25:33.184073 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2988bfcd_6a34_4a15_8a36_953ca658c25b.slice/crio-dc8154d04962bf7d3ea895a0b5eae8568474085393521daa75fa73b00eebcbff WatchSource:0}: Error finding container dc8154d04962bf7d3ea895a0b5eae8568474085393521daa75fa73b00eebcbff: Status 404 returned error can't find the container with id dc8154d04962bf7d3ea895a0b5eae8568474085393521daa75fa73b00eebcbff
Dec 05 20:25:33 crc kubenswrapper[4885]: I1205 20:25:33.186300 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962a5840-991a-4f47-960f-b75f1bc33fa8" path="/var/lib/kubelet/pods/962a5840-991a-4f47-960f-b75f1bc33fa8/volumes"
Dec 05 20:25:33 crc kubenswrapper[4885]: I1205 20:25:33.187288 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:25:33 crc kubenswrapper[4885]: I1205 20:25:33.878846 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.032280 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4tgwl"]
Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.036165 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tgwl"
Need to start a new one" pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.063665 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4tgwl"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.090354 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.090482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txphl\" (UniqueName: \"kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.119149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerStarted","Data":"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4"} Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.119226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerStarted","Data":"dc8154d04962bf7d3ea895a0b5eae8568474085393521daa75fa73b00eebcbff"} Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.135695 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-cn68w"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.136883 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.177578 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cn68w"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.191249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e423-account-create-update-4n6sv"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.192467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnwv\" (UniqueName: \"kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.192523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.192650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.192689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txphl\" (UniqueName: \"kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.194310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.197429 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.203950 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e423-account-create-update-4n6sv"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.206385 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.220858 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txphl\" (UniqueName: \"kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl\") pod \"nova-api-db-create-4tgwl\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") " pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.297763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnwv\" (UniqueName: \"kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.298148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.298181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts\") pod \"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.298389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccgf\" (UniqueName: \"kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf\") pod \"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.299188 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.323719 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k4lkf"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.326114 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.335450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnwv\" (UniqueName: \"kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv\") pod \"nova-cell0-db-create-cn68w\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.352402 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k4lkf"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.367211 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7478-account-create-update-7whgg"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.368281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.368779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.373210 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.380951 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7478-account-create-update-7whgg"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.400466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts\") pod \"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.400526 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.400566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsnc5\" (UniqueName: \"kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.400596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.400624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccgf\" (UniqueName: \"kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf\") pod 
\"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.403327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts\") pod \"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.403365 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4nj\" (UniqueName: \"kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.426602 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccgf\" (UniqueName: \"kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf\") pod \"nova-api-e423-account-create-update-4n6sv\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.461649 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.506244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.506809 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4nj\" (UniqueName: \"kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.507270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.510169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsnc5\" (UniqueName: \"kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.517044 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.520244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.520725 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.545548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsnc5\" (UniqueName: \"kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5\") pod \"nova-cell0-7478-account-create-update-7whgg\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.550469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4nj\" (UniqueName: \"kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj\") pod \"nova-cell1-db-create-k4lkf\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.571132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d424-account-create-update-jc98w"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.572299 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.579287 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d424-account-create-update-jc98w"] Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.580473 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.614090 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.614189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zdz\" (UniqueName: \"kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.715966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.716359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zdz\" (UniqueName: \"kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.717685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.741696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zdz\" (UniqueName: \"kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz\") pod \"nova-cell1-d424-account-create-update-jc98w\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.762298 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.768513 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:34 crc kubenswrapper[4885]: I1205 20:25:34.893044 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.019359 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4tgwl"] Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.132010 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerStarted","Data":"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1"} Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.165568 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cn68w"] Dec 05 20:25:35 crc kubenswrapper[4885]: W1205 20:25:35.223453 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91fa7e5b_9ed9_44de_bd54_105f4608ddb6.slice/crio-dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70 WatchSource:0}: Error finding container dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70: Status 404 returned error can't find the container with id dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70 Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.277061 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e423-account-create-update-4n6sv"] Dec 05 20:25:35 crc kubenswrapper[4885]: W1205 20:25:35.291051 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88ff47_91a3_4fb9_9526_cc39661cbeec.slice/crio-43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19 WatchSource:0}: Error finding container 43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19: Status 404 returned error can't find the container with id 43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19 Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.318257 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.640202 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7478-account-create-update-7whgg"] Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.648787 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d424-account-create-update-jc98w"] Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.653353 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.663465 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 20:25:35 crc kubenswrapper[4885]: I1205 20:25:35.772460 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k4lkf"] Dec 05 20:25:35 crc kubenswrapper[4885]: W1205 20:25:35.782756 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod607fd1c0_165f_465f_bdd4_134ab3451a51.slice/crio-cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904 WatchSource:0}: Error finding container cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904: Status 404 returned error can't find the container with id cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904 Dec 05 20:25:36 crc 
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.140167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cn68w" event={"ID":"91fa7e5b-9ed9-44de-bd54-105f4608ddb6","Type":"ContainerDied","Data":"9f22a37060b3f6581bd4a3301e5abe5fd3875cbe8d16330efa711db01a8cf445"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.140197 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cn68w" event={"ID":"91fa7e5b-9ed9-44de-bd54-105f4608ddb6","Type":"ContainerStarted","Data":"dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.141547 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7478-account-create-update-7whgg" event={"ID":"4e7ef23d-578c-43d5-b7eb-a15cefb90d03","Type":"ContainerStarted","Data":"80b8999c0668b1027376f7aa419ede72180e2e7ef89819db0d9e40edc3aad5ec"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.143108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k4lkf" event={"ID":"607fd1c0-165f-465f-bdd4-134ab3451a51","Type":"ContainerStarted","Data":"cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.145174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerStarted","Data":"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.146480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d424-account-create-update-jc98w" event={"ID":"067e647c-7401-4dd7-9245-94d1675f1bb6","Type":"ContainerStarted","Data":"658a7d7be1332c16c31805a804fef9c1036cf10fe1d7f8d14050756327a7db60"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.147817 4885 generic.go:334] "Generic (PLEG): container finished" podID="8f88ff47-91a3-4fb9-9526-cc39661cbeec" containerID="8330ce052621a2e1ea010ffed9518941ef80d224afab2b140f20682c4b33b5a5" exitCode=0
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.147877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e423-account-create-update-4n6sv" event={"ID":"8f88ff47-91a3-4fb9-9526-cc39661cbeec","Type":"ContainerDied","Data":"8330ce052621a2e1ea010ffed9518941ef80d224afab2b140f20682c4b33b5a5"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.147894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e423-account-create-update-4n6sv" event={"ID":"8f88ff47-91a3-4fb9-9526-cc39661cbeec","Type":"ContainerStarted","Data":"43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.150101 4885 generic.go:334] "Generic (PLEG): container finished" podID="ed6ed529-d71f-4427-b906-ec6d3e9c33f0" containerID="23832c1b8f6362618489f30ab3c7de95c873488a4ea5dddad673869c92c3c15e" exitCode=0
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.150125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tgwl" event={"ID":"ed6ed529-d71f-4427-b906-ec6d3e9c33f0","Type":"ContainerDied","Data":"23832c1b8f6362618489f30ab3c7de95c873488a4ea5dddad673869c92c3c15e"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.150141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tgwl" event={"ID":"ed6ed529-d71f-4427-b906-ec6d3e9c33f0","Type":"ContainerStarted","Data":"4785a6c65811e986e8adfcc1b3f2e864d4fb91a27a22c0133750891d61bb486d"}
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.847919 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.848368 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-log" containerID="cri-o://6ab208155a9cb0552587aea741d5d3637c7f1b625991327a2e713b8532ee3134" gracePeriod=30
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.848594 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-httpd" containerID="cri-o://7029cca0d3d40d2220c1c94a687d5932e7c88807b0abab2d26a4166e2917d62f" gracePeriod=30
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.874080 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7ddb869454-vvfd9" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused"
Dec 05 20:25:36 crc kubenswrapper[4885]: I1205 20:25:36.874204 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7ddb869454-vvfd9"
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.159787 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerID="6ab208155a9cb0552587aea741d5d3637c7f1b625991327a2e713b8532ee3134" exitCode=143
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.159882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerDied","Data":"6ab208155a9cb0552587aea741d5d3637c7f1b625991327a2e713b8532ee3134"}
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.162081 4885 generic.go:334] "Generic (PLEG): container finished" podID="4e7ef23d-578c-43d5-b7eb-a15cefb90d03" containerID="159eb634e9bcc31b97dbcbf6020bb37f641eecf16ffcfce2f60ff4da5650b8c1" exitCode=0
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.162147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7478-account-create-update-7whgg" event={"ID":"4e7ef23d-578c-43d5-b7eb-a15cefb90d03","Type":"ContainerDied","Data":"159eb634e9bcc31b97dbcbf6020bb37f641eecf16ffcfce2f60ff4da5650b8c1"}
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.163752 4885 generic.go:334] "Generic (PLEG): container finished" podID="607fd1c0-165f-465f-bdd4-134ab3451a51" containerID="d4ac867e8697362f04a9e106845ef431056cb0d2bf0cf8a5183b7a9c25b08545" exitCode=0
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.163789 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k4lkf" event={"ID":"607fd1c0-165f-465f-bdd4-134ab3451a51","Type":"ContainerDied","Data":"d4ac867e8697362f04a9e106845ef431056cb0d2bf0cf8a5183b7a9c25b08545"}
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.170847 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerStarted","Data":"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d"}
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.171001 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-central-agent" containerID="cri-o://c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4" gracePeriod=30
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.171239 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.171297 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="proxy-httpd" containerID="cri-o://e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d" gracePeriod=30
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.171354 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="sg-core" containerID="cri-o://844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c" gracePeriod=30
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.171408 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-notification-agent" containerID="cri-o://d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1" gracePeriod=30
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.191605 4885 generic.go:334] "Generic (PLEG): container finished" podID="067e647c-7401-4dd7-9245-94d1675f1bb6" containerID="b657377210aa955017cbaa63c4b6f5cbdc53d16343057b6205a4890d542736e2" exitCode=0
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.193170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d424-account-create-update-jc98w" event={"ID":"067e647c-7401-4dd7-9245-94d1675f1bb6","Type":"ContainerDied","Data":"b657377210aa955017cbaa63c4b6f5cbdc53d16343057b6205a4890d542736e2"}
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.227947 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.624450424 podStartE2EDuration="5.227927561s" podCreationTimestamp="2025-12-05 20:25:32 +0000 UTC" firstStartedPulling="2025-12-05 20:25:33.185842918 +0000 UTC m=+1198.482658569" lastFinishedPulling="2025-12-05 20:25:36.789320045 +0000 UTC m=+1202.086135706" observedRunningTime="2025-12-05 20:25:37.224309308 +0000 UTC m=+1202.521124959" watchObservedRunningTime="2025-12-05 20:25:37.227927561 +0000 UTC m=+1202.524743222"
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.542957 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tgwl"
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.578004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txphl\" (UniqueName: \"kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl\") pod \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") "
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.578064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts\") pod \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\" (UID: \"ed6ed529-d71f-4427-b906-ec6d3e9c33f0\") "
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.579162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed6ed529-d71f-4427-b906-ec6d3e9c33f0" (UID: "ed6ed529-d71f-4427-b906-ec6d3e9c33f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.585264 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl" (OuterVolumeSpecName: "kube-api-access-txphl") pod "ed6ed529-d71f-4427-b906-ec6d3e9c33f0" (UID: "ed6ed529-d71f-4427-b906-ec6d3e9c33f0"). InnerVolumeSpecName "kube-api-access-txphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.611075 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e423-account-create-update-4n6sv"
Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.635371 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cn68w"
Need to start a new one" pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.679497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnwv\" (UniqueName: \"kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv\") pod \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.679566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ccgf\" (UniqueName: \"kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf\") pod \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.679664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts\") pod \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\" (UID: \"91fa7e5b-9ed9-44de-bd54-105f4608ddb6\") " Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.679841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts\") pod \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\" (UID: \"8f88ff47-91a3-4fb9-9526-cc39661cbeec\") " Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.680352 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txphl\" (UniqueName: \"kubernetes.io/projected/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-kube-api-access-txphl\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.680373 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6ed529-d71f-4427-b906-ec6d3e9c33f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.680366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91fa7e5b-9ed9-44de-bd54-105f4608ddb6" (UID: "91fa7e5b-9ed9-44de-bd54-105f4608ddb6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.680866 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f88ff47-91a3-4fb9-9526-cc39661cbeec" (UID: "8f88ff47-91a3-4fb9-9526-cc39661cbeec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.683238 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf" (OuterVolumeSpecName: "kube-api-access-9ccgf") pod "8f88ff47-91a3-4fb9-9526-cc39661cbeec" (UID: "8f88ff47-91a3-4fb9-9526-cc39661cbeec"). InnerVolumeSpecName "kube-api-access-9ccgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.683286 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv" (OuterVolumeSpecName: "kube-api-access-llnwv") pod "91fa7e5b-9ed9-44de-bd54-105f4608ddb6" (UID: "91fa7e5b-9ed9-44de-bd54-105f4608ddb6"). InnerVolumeSpecName "kube-api-access-llnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.781711 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f88ff47-91a3-4fb9-9526-cc39661cbeec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.782032 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llnwv\" (UniqueName: \"kubernetes.io/projected/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-kube-api-access-llnwv\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.782110 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ccgf\" (UniqueName: \"kubernetes.io/projected/8f88ff47-91a3-4fb9-9526-cc39661cbeec-kube-api-access-9ccgf\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:37 crc kubenswrapper[4885]: I1205 20:25:37.782173 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa7e5b-9ed9-44de-bd54-105f4608ddb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.201269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e423-account-create-update-4n6sv" event={"ID":"8f88ff47-91a3-4fb9-9526-cc39661cbeec","Type":"ContainerDied","Data":"43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.201892 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ae3bb29322928719a650e3490dd715f99cf2a05870d3f1d4e867cd081b6a19" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.201301 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e423-account-create-update-4n6sv" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.202400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tgwl" event={"ID":"ed6ed529-d71f-4427-b906-ec6d3e9c33f0","Type":"ContainerDied","Data":"4785a6c65811e986e8adfcc1b3f2e864d4fb91a27a22c0133750891d61bb486d"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.202428 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4785a6c65811e986e8adfcc1b3f2e864d4fb91a27a22c0133750891d61bb486d" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.202487 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tgwl" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.210920 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-cn68w" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.210948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cn68w" event={"ID":"91fa7e5b-9ed9-44de-bd54-105f4608ddb6","Type":"ContainerDied","Data":"dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.210994 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd86b15925d057bb6fe8bd7c8f8b3a4ddd178c0baf715be9652ab39b9ea97c70" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.215529 4885 generic.go:334] "Generic (PLEG): container finished" podID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerID="844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c" exitCode=2 Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.215702 4885 generic.go:334] "Generic (PLEG): container finished" podID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerID="d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1" exitCode=0 Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.215824 4885 generic.go:334] "Generic (PLEG): container finished" podID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerID="c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4" exitCode=0 Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.215575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerDied","Data":"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.215991 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerDied","Data":"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.216011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerDied","Data":"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4"} Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.710299 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.824109 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts\") pod \"607fd1c0-165f-465f-bdd4-134ab3451a51\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.824267 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx4nj\" (UniqueName: \"kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj\") pod \"607fd1c0-165f-465f-bdd4-134ab3451a51\" (UID: \"607fd1c0-165f-465f-bdd4-134ab3451a51\") " Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.824757 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "607fd1c0-165f-465f-bdd4-134ab3451a51" (UID: "607fd1c0-165f-465f-bdd4-134ab3451a51"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.824936 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/607fd1c0-165f-465f-bdd4-134ab3451a51-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.830289 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj" (OuterVolumeSpecName: "kube-api-access-qx4nj") pod "607fd1c0-165f-465f-bdd4-134ab3451a51" (UID: "607fd1c0-165f-465f-bdd4-134ab3451a51"). InnerVolumeSpecName "kube-api-access-qx4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.872115 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.872359 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-log" containerID="cri-o://718b07ee517bcbc2f813f76df0a609506d09f57622f85d61de7d11107b153421" gracePeriod=30 Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.872494 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-httpd" containerID="cri-o://2d3307ebd18ff387a0240c326fb657960c0a774cb4ed54ed7b096e1a936acbd9" gracePeriod=30 Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.926252 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx4nj\" (UniqueName: \"kubernetes.io/projected/607fd1c0-165f-465f-bdd4-134ab3451a51-kube-api-access-qx4nj\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.927171 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:38 crc kubenswrapper[4885]: I1205 20:25:38.936057 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.133931 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6zdz\" (UniqueName: \"kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz\") pod \"067e647c-7401-4dd7-9245-94d1675f1bb6\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134011 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsnc5\" (UniqueName: \"kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5\") pod \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134169 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts\") pod \"067e647c-7401-4dd7-9245-94d1675f1bb6\" (UID: \"067e647c-7401-4dd7-9245-94d1675f1bb6\") " Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134231 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts\") pod \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\" (UID: \"4e7ef23d-578c-43d5-b7eb-a15cefb90d03\") " Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "067e647c-7401-4dd7-9245-94d1675f1bb6" (UID: "067e647c-7401-4dd7-9245-94d1675f1bb6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134666 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e7ef23d-578c-43d5-b7eb-a15cefb90d03" (UID: "4e7ef23d-578c-43d5-b7eb-a15cefb90d03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.134739 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067e647c-7401-4dd7-9245-94d1675f1bb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.137174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz" (OuterVolumeSpecName: "kube-api-access-q6zdz") pod "067e647c-7401-4dd7-9245-94d1675f1bb6" (UID: "067e647c-7401-4dd7-9245-94d1675f1bb6"). InnerVolumeSpecName "kube-api-access-q6zdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.137647 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5" (OuterVolumeSpecName: "kube-api-access-jsnc5") pod "4e7ef23d-578c-43d5-b7eb-a15cefb90d03" (UID: "4e7ef23d-578c-43d5-b7eb-a15cefb90d03"). 
InnerVolumeSpecName "kube-api-access-jsnc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.227213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d424-account-create-update-jc98w" event={"ID":"067e647c-7401-4dd7-9245-94d1675f1bb6","Type":"ContainerDied","Data":"658a7d7be1332c16c31805a804fef9c1036cf10fe1d7f8d14050756327a7db60"} Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.227261 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658a7d7be1332c16c31805a804fef9c1036cf10fe1d7f8d14050756327a7db60" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.227229 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d424-account-create-update-jc98w" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.229371 4885 generic.go:334] "Generic (PLEG): container finished" podID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerID="718b07ee517bcbc2f813f76df0a609506d09f57622f85d61de7d11107b153421" exitCode=143 Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.229425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerDied","Data":"718b07ee517bcbc2f813f76df0a609506d09f57622f85d61de7d11107b153421"} Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.232000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7478-account-create-update-7whgg" event={"ID":"4e7ef23d-578c-43d5-b7eb-a15cefb90d03","Type":"ContainerDied","Data":"80b8999c0668b1027376f7aa419ede72180e2e7ef89819db0d9e40edc3aad5ec"} Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.232039 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b8999c0668b1027376f7aa419ede72180e2e7ef89819db0d9e40edc3aad5ec" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.232069 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7478-account-create-update-7whgg" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.234037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k4lkf" event={"ID":"607fd1c0-165f-465f-bdd4-134ab3451a51","Type":"ContainerDied","Data":"cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904"} Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.234075 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca0b0e20705a038f05eade167686884379f5895b1531e824092991015276904" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.234189 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k4lkf" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.236685 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.236716 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zdz\" (UniqueName: \"kubernetes.io/projected/067e647c-7401-4dd7-9245-94d1675f1bb6-kube-api-access-q6zdz\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:39 crc kubenswrapper[4885]: I1205 20:25:39.236731 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsnc5\" (UniqueName: \"kubernetes.io/projected/4e7ef23d-578c-43d5-b7eb-a15cefb90d03-kube-api-access-jsnc5\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.272733 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerID="7029cca0d3d40d2220c1c94a687d5932e7c88807b0abab2d26a4166e2917d62f" exitCode=0 Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.273104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerDied","Data":"7029cca0d3d40d2220c1c94a687d5932e7c88807b0abab2d26a4166e2917d62f"} Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.626585 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7wrx\" (UniqueName: \"kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806735 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806770 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806796 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806863 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.806973 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data\") pod \"6a2ee42f-a754-4128-a568-f321de7b1beb\" (UID: \"6a2ee42f-a754-4128-a568-f321de7b1beb\") " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.807549 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs" (OuterVolumeSpecName: "logs") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.808010 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.815344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx" (OuterVolumeSpecName: "kube-api-access-w7wrx") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "kube-api-access-w7wrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.821232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts" (OuterVolumeSpecName: "scripts") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.821321 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.846375 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.864914 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.867215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data" (OuterVolumeSpecName: "config-data") pod "6a2ee42f-a754-4128-a568-f321de7b1beb" (UID: "6a2ee42f-a754-4128-a568-f321de7b1beb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908732 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908766 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7wrx\" (UniqueName: \"kubernetes.io/projected/6a2ee42f-a754-4128-a568-f321de7b1beb-kube-api-access-w7wrx\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908776 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908785 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908794 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a2ee42f-a754-4128-a568-f321de7b1beb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908804 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908832 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.908844 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2ee42f-a754-4128-a568-f321de7b1beb-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4885]: I1205 20:25:40.929454 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.010617 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.283486 
4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a2ee42f-a754-4128-a568-f321de7b1beb","Type":"ContainerDied","Data":"70973bd42e9dabfc5e7aed5042044e739a5298534c7fc49813cd0bc2d79faaa2"} Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.283545 4885 scope.go:117] "RemoveContainer" containerID="7029cca0d3d40d2220c1c94a687d5932e7c88807b0abab2d26a4166e2917d62f" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.283689 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.314401 4885 scope.go:117] "RemoveContainer" containerID="6ab208155a9cb0552587aea741d5d3637c7f1b625991327a2e713b8532ee3134" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.322076 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.330080 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366208 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366768 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7ef23d-578c-43d5-b7eb-a15cefb90d03" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366800 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7ef23d-578c-43d5-b7eb-a15cefb90d03" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366830 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067e647c-7401-4dd7-9245-94d1675f1bb6" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366843 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="067e647c-7401-4dd7-9245-94d1675f1bb6" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366860 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fa7e5b-9ed9-44de-bd54-105f4608ddb6" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fa7e5b-9ed9-44de-bd54-105f4608ddb6" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366899 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6ed529-d71f-4427-b906-ec6d3e9c33f0" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366910 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6ed529-d71f-4427-b906-ec6d3e9c33f0" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366945 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-httpd" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366955 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-httpd" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.366978 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607fd1c0-165f-465f-bdd4-134ab3451a51" 
containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.366990 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="607fd1c0-165f-465f-bdd4-134ab3451a51" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.367012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f88ff47-91a3-4fb9-9526-cc39661cbeec" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367118 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f88ff47-91a3-4fb9-9526-cc39661cbeec" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: E1205 20:25:41.367145 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-log" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367157 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-log" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367501 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-httpd" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367524 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="607fd1c0-165f-465f-bdd4-134ab3451a51" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367540 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fa7e5b-9ed9-44de-bd54-105f4608ddb6" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367550 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f88ff47-91a3-4fb9-9526-cc39661cbeec" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e7ef23d-578c-43d5-b7eb-a15cefb90d03" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367583 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" containerName="glance-log" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367600 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="067e647c-7401-4dd7-9245-94d1675f1bb6" containerName="mariadb-account-create-update" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.367618 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6ed529-d71f-4427-b906-ec6d3e9c33f0" containerName="mariadb-database-create" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.368864 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.370920 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.371145 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.386074 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.523393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-logs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.523754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.523790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.523819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.523886 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.524040 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-scripts\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.524066 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-config-data\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.524099 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ndrfd\" (UniqueName: \"kubernetes.io/projected/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-kube-api-access-ndrfd\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625846 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-scripts\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625881 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-config-data\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625915 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrfd\" (UniqueName: \"kubernetes.io/projected/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-kube-api-access-ndrfd\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-logs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.625974 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.626002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.626047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.626244 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.626873 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.627117 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-logs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.630427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.631811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-config-data\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.637181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-scripts\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.638222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.647737 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrfd\" (UniqueName: \"kubernetes.io/projected/c88a6c22-ae9a-4d43-9a63-e6ea351eb012-kube-api-access-ndrfd\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.666379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c88a6c22-ae9a-4d43-9a63-e6ea351eb012\") " pod="openstack/glance-default-external-api-0" Dec 05 20:25:41 crc kubenswrapper[4885]: I1205 20:25:41.686488 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.203209 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.305323 4885 generic.go:334] "Generic (PLEG): container finished" podID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerID="2d3307ebd18ff387a0240c326fb657960c0a774cb4ed54ed7b096e1a936acbd9" exitCode=0 Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.305398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerDied","Data":"2d3307ebd18ff387a0240c326fb657960c0a774cb4ed54ed7b096e1a936acbd9"} Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.308360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c88a6c22-ae9a-4d43-9a63-e6ea351eb012","Type":"ContainerStarted","Data":"4c3f11a811b61464525d7f697d31b46d59466eb139850c17cdb75c38d2d35ec2"} Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.518995 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642146 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgtcs\" (UniqueName: \"kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642190 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642212 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642315 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642473 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\" (UID: \"adef1acf-bcea-43fb-a6ad-e4fea6b24643\") " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642628 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs" (OuterVolumeSpecName: "logs") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.642882 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.643519 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.643536 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/adef1acf-bcea-43fb-a6ad-e4fea6b24643-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.650263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.650448 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts" (OuterVolumeSpecName: "scripts") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.653761 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs" (OuterVolumeSpecName: "kube-api-access-lgtcs") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "kube-api-access-lgtcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.720557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.731727 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data" (OuterVolumeSpecName: "config-data") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.745307 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.745345 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.745355 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.745386 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.745396 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgtcs\" (UniqueName: \"kubernetes.io/projected/adef1acf-bcea-43fb-a6ad-e4fea6b24643-kube-api-access-lgtcs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.763349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "adef1acf-bcea-43fb-a6ad-e4fea6b24643" (UID: "adef1acf-bcea-43fb-a6ad-e4fea6b24643"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.773526 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.846810 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:42 crc kubenswrapper[4885]: I1205 20:25:42.846870 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adef1acf-bcea-43fb-a6ad-e4fea6b24643-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.140389 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.189104 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2ee42f-a754-4128-a568-f321de7b1beb" path="/var/lib/kubelet/pods/6a2ee42f-a754-4128-a568-f321de7b1beb/volumes" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.252907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.252959 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.253093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fcc2\" (UniqueName: \"kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.253189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.253247 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.253307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.253353 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data\") pod \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\" (UID: \"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a\") " Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.255895 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs" (OuterVolumeSpecName: "logs") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.258677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.262791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2" (OuterVolumeSpecName: "kube-api-access-7fcc2") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "kube-api-access-7fcc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.287535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data" (OuterVolumeSpecName: "config-data") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.291004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts" (OuterVolumeSpecName: "scripts") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.293230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.321451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" (UID: "58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.340431 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"adef1acf-bcea-43fb-a6ad-e4fea6b24643","Type":"ContainerDied","Data":"b04b247389d6f41f39a5662af344993783aeaf9355c9d6fa9e7bcae64e652c12"} Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.340520 4885 scope.go:117] "RemoveContainer" containerID="2d3307ebd18ff387a0240c326fb657960c0a774cb4ed54ed7b096e1a936acbd9" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.340556 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.345322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c88a6c22-ae9a-4d43-9a63-e6ea351eb012","Type":"ContainerStarted","Data":"b923646d7750abb45c2fa6bb18566952f7abb683eca2903b897e1a686b9712dc"} Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.348471 4885 generic.go:334] "Generic (PLEG): container finished" podID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerID="c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97" exitCode=137 Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.348518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerDied","Data":"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97"} Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.348545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddb869454-vvfd9" event={"ID":"58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a","Type":"ContainerDied","Data":"83f06635eceaaf2d00f9fd79d4ff304fde25ce7920a71f925b0717b58d2cf852"} Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.348597 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ddb869454-vvfd9" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362340 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362369 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362385 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362397 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362409 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362420 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.362431 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fcc2\" (UniqueName: \"kubernetes.io/projected/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a-kube-api-access-7fcc2\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.384779 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.398475 4885 
scope.go:117] "RemoveContainer" containerID="718b07ee517bcbc2f813f76df0a609506d09f57622f85d61de7d11107b153421" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.401165 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.432534 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"] Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.439882 4885 scope.go:117] "RemoveContainer" containerID="7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.440722 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7ddb869454-vvfd9"] Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448061 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.448459 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon-log" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448475 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon-log" Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.448485 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448492 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.448505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-httpd" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448511 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-httpd" Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.448532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-log" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448539 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-log" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448703 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon-log" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448719 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" containerName="horizon" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448830 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-log" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.448847 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" containerName="glance-httpd" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.450267 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.452740 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.452947 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.459542 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.567579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.568316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.568480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.568581 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.568775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.568912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.569638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4qr\" (UniqueName: \"kubernetes.io/projected/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-kube-api-access-bp4qr\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.569733 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.626744 4885 scope.go:117] "RemoveContainer" containerID="c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.671410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672092 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672229 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4qr\" (UniqueName: \"kubernetes.io/projected/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-kube-api-access-bp4qr\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 
05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.672272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.673213 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.673270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.679365 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.682863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.685345 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.685875 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.694800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4qr\" (UniqueName: \"kubernetes.io/projected/44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d-kube-api-access-bp4qr\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.705048 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d\") " pod="openstack/glance-default-internal-api-0" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.784168 4885 scope.go:117] "RemoveContainer" 
containerID="7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5" Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.784594 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5\": container with ID starting with 7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5 not found: ID does not exist" containerID="7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.784639 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5"} err="failed to get container status \"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5\": rpc error: code = NotFound desc = could not find container \"7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5\": container with ID starting with 7d4897e7e9fe34f5c8e863c727990aaf2e3ffa96de3ab3cb8b2927f061b528b5 not found: ID does not exist" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.784681 4885 scope.go:117] "RemoveContainer" containerID="c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97" Dec 05 20:25:43 crc kubenswrapper[4885]: E1205 20:25:43.785046 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97\": container with ID starting with c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97 not found: ID does not exist" containerID="c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.785143 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97"} err="failed to get container status \"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97\": rpc error: code = NotFound desc = could not find container \"c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97\": container with ID starting with c4d686985a3af471508ab1b5d0a4c3ed14ad0ec2a8a4399057c6c1c976215e97 not found: ID does not exist" Dec 05 20:25:43 crc kubenswrapper[4885]: I1205 20:25:43.793753 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.325665 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.358520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d","Type":"ContainerStarted","Data":"66a4d0b0e822eb71562cbc645afdfaa313087702fa13e8a22f1baac2e7cceece"} Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.362126 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c88a6c22-ae9a-4d43-9a63-e6ea351eb012","Type":"ContainerStarted","Data":"f7d22c889d077d06e51f7f5a6e9d2481887e3a44273b0bd84626c4e68a760724"} Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.385381 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.385366063 podStartE2EDuration="3.385366063s" podCreationTimestamp="2025-12-05 20:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:44.380643706 +0000 UTC m=+1209.677459367" watchObservedRunningTime="2025-12-05 20:25:44.385366063 +0000 UTC m=+1209.682181724" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.644363 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rwl9m"] Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.645660 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.649688 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vmp6w" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.650674 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.650827 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.659936 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rwl9m"] Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.794523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.794604 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.794752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.794886 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzzzk\" (UniqueName: \"kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.896883 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.896992 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.897064 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.897116 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzzzk\" (UniqueName: \"kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.903317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.903565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.913269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:44 crc kubenswrapper[4885]: I1205 20:25:44.930476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jzzzk\" (UniqueName: \"kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk\") pod \"nova-cell0-conductor-db-sync-rwl9m\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:45 crc kubenswrapper[4885]: I1205 20:25:45.001497 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:25:45 crc kubenswrapper[4885]: I1205 20:25:45.230124 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a" path="/var/lib/kubelet/pods/58faa50e-ede0-4c8e-ad2d-d76b7e1feb2a/volumes" Dec 05 20:25:45 crc kubenswrapper[4885]: I1205 20:25:45.232818 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adef1acf-bcea-43fb-a6ad-e4fea6b24643" path="/var/lib/kubelet/pods/adef1acf-bcea-43fb-a6ad-e4fea6b24643/volumes" Dec 05 20:25:45 crc kubenswrapper[4885]: I1205 20:25:45.394908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d","Type":"ContainerStarted","Data":"7c5a9bc4b71cefba726e03a146914aa2c71c1a9934dd913cc864743756842a7b"} Dec 05 20:25:45 crc kubenswrapper[4885]: I1205 20:25:45.608010 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rwl9m"] Dec 05 20:25:45 crc kubenswrapper[4885]: W1205 20:25:45.616497 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e595c7_7f03_4290_94df_5a3177b31c16.slice/crio-30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14 WatchSource:0}: Error finding container 30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14: Status 404 returned error can't find the container with id 30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14 Dec 05 20:25:46 crc kubenswrapper[4885]: I1205 20:25:46.410105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d","Type":"ContainerStarted","Data":"715560a978e90a3abccb9ed0120428037fd595342e99b89ca64f1acbd22e4a6c"} Dec 05 20:25:46 crc kubenswrapper[4885]: I1205 20:25:46.412974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" event={"ID":"28e595c7-7f03-4290-94df-5a3177b31c16","Type":"ContainerStarted","Data":"30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14"} Dec 05 20:25:46 crc kubenswrapper[4885]: I1205 20:25:46.448871 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.448848584 podStartE2EDuration="3.448848584s" podCreationTimestamp="2025-12-05 20:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:46.429469839 +0000 UTC m=+1211.726285500" watchObservedRunningTime="2025-12-05 20:25:46.448848584 +0000 UTC m=+1211.745664245" Dec 05 20:25:51 crc kubenswrapper[4885]: I1205 20:25:51.687285 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 20:25:51 crc kubenswrapper[4885]: I1205 20:25:51.687901 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Dec 05 20:25:51 crc kubenswrapper[4885]: I1205 20:25:51.728929 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:25:51 crc kubenswrapper[4885]: I1205 20:25:51.764922 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 20:25:52 crc kubenswrapper[4885]: I1205 20:25:52.464667 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:25:52 crc kubenswrapper[4885]: I1205 20:25:52.465254 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.474115 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" event={"ID":"28e595c7-7f03-4290-94df-5a3177b31c16","Type":"ContainerStarted","Data":"d6082b550762079a1f6e3eff5f27e3cb12fea5804fab76fa63d6d0513b45530d"} Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.494266 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" podStartSLOduration=2.680429395 podStartE2EDuration="9.494242101s" podCreationTimestamp="2025-12-05 20:25:44 +0000 UTC" firstStartedPulling="2025-12-05 20:25:45.619248187 +0000 UTC m=+1210.916063848" lastFinishedPulling="2025-12-05 20:25:52.433060893 +0000 UTC m=+1217.729876554" observedRunningTime="2025-12-05 20:25:53.488473662 +0000 UTC m=+1218.785289323" watchObservedRunningTime="2025-12-05 20:25:53.494242101 +0000 UTC m=+1218.791057762" Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.803941 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.804329 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.835978 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:53 crc kubenswrapper[4885]: I1205 20:25:53.852358 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:54 crc kubenswrapper[4885]: I1205 20:25:54.384052 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 20:25:54 crc kubenswrapper[4885]: I1205 20:25:54.429955 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 20:25:54 crc kubenswrapper[4885]: I1205 20:25:54.484633 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:54 crc kubenswrapper[4885]: I1205 20:25:54.484673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:56 crc kubenswrapper[4885]: I1205 20:25:56.410822 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 20:25:56 crc kubenswrapper[4885]: I1205 20:25:56.421990 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 
20:26:02 crc kubenswrapper[4885]: I1205 20:26:02.558483 4885 generic.go:334] "Generic (PLEG): container finished" podID="28e595c7-7f03-4290-94df-5a3177b31c16" containerID="d6082b550762079a1f6e3eff5f27e3cb12fea5804fab76fa63d6d0513b45530d" exitCode=0 Dec 05 20:26:02 crc kubenswrapper[4885]: I1205 20:26:02.558575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" event={"ID":"28e595c7-7f03-4290-94df-5a3177b31c16","Type":"ContainerDied","Data":"d6082b550762079a1f6e3eff5f27e3cb12fea5804fab76fa63d6d0513b45530d"} Dec 05 20:26:02 crc kubenswrapper[4885]: I1205 20:26:02.759218 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 20:26:03 crc kubenswrapper[4885]: I1205 20:26:03.916081 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.029809 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzzzk\" (UniqueName: \"kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk\") pod \"28e595c7-7f03-4290-94df-5a3177b31c16\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.030935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle\") pod \"28e595c7-7f03-4290-94df-5a3177b31c16\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.031006 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts\") pod \"28e595c7-7f03-4290-94df-5a3177b31c16\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.031925 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data\") pod \"28e595c7-7f03-4290-94df-5a3177b31c16\" (UID: \"28e595c7-7f03-4290-94df-5a3177b31c16\") " Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.194717 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts" (OuterVolumeSpecName: "scripts") pod "28e595c7-7f03-4290-94df-5a3177b31c16" (UID: "28e595c7-7f03-4290-94df-5a3177b31c16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.194815 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk" (OuterVolumeSpecName: "kube-api-access-jzzzk") pod "28e595c7-7f03-4290-94df-5a3177b31c16" (UID: "28e595c7-7f03-4290-94df-5a3177b31c16"). InnerVolumeSpecName "kube-api-access-jzzzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.201612 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e595c7-7f03-4290-94df-5a3177b31c16" (UID: "28e595c7-7f03-4290-94df-5a3177b31c16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.209827 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data" (OuterVolumeSpecName: "config-data") pod "28e595c7-7f03-4290-94df-5a3177b31c16" (UID: "28e595c7-7f03-4290-94df-5a3177b31c16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.256810 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.256860 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzzzk\" (UniqueName: \"kubernetes.io/projected/28e595c7-7f03-4290-94df-5a3177b31c16-kube-api-access-jzzzk\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.256882 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.256899 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e595c7-7f03-4290-94df-5a3177b31c16-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.587747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" event={"ID":"28e595c7-7f03-4290-94df-5a3177b31c16","Type":"ContainerDied","Data":"30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14"} Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.587795 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c6b3ed44727067fdaa80a755d9b829dc38eea746e7298eae90775a6d675c14" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.587871 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rwl9m" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.688471 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 20:26:04 crc kubenswrapper[4885]: E1205 20:26:04.689387 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e595c7-7f03-4290-94df-5a3177b31c16" containerName="nova-cell0-conductor-db-sync" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.689493 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e595c7-7f03-4290-94df-5a3177b31c16" containerName="nova-cell0-conductor-db-sync" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.689720 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e595c7-7f03-4290-94df-5a3177b31c16" containerName="nova-cell0-conductor-db-sync" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.690413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.692580 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vmp6w" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.693434 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.722752 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.866976 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.867115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.867153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db55g\" (UniqueName: \"kubernetes.io/projected/b8f6973a-6753-4845-a273-798f031cf4d6-kube-api-access-db55g\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.968699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.968750 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db55g\" (UniqueName: \"kubernetes.io/projected/b8f6973a-6753-4845-a273-798f031cf4d6-kube-api-access-db55g\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc 
kubenswrapper[4885]: I1205 20:26:04.968863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.973534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.974388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f6973a-6753-4845-a273-798f031cf4d6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:04 crc kubenswrapper[4885]: I1205 20:26:04.988958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db55g\" (UniqueName: \"kubernetes.io/projected/b8f6973a-6753-4845-a273-798f031cf4d6-kube-api-access-db55g\") pod \"nova-cell0-conductor-0\" (UID: \"b8f6973a-6753-4845-a273-798f031cf4d6\") " pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:05 crc kubenswrapper[4885]: I1205 20:26:05.007261 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:05 crc kubenswrapper[4885]: I1205 20:26:05.490458 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 20:26:05 crc kubenswrapper[4885]: I1205 20:26:05.600238 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b8f6973a-6753-4845-a273-798f031cf4d6","Type":"ContainerStarted","Data":"521899a9201b5c443e589a42efbe9455cee575ce9347270942eafdcc6dcf99c7"} Dec 05 20:26:06 crc kubenswrapper[4885]: I1205 20:26:06.611968 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b8f6973a-6753-4845-a273-798f031cf4d6","Type":"ContainerStarted","Data":"1051ecdeb5616af89deb7f1246e8009b8e23c987ca388a4d89eb2b5511af40e5"} Dec 05 20:26:06 crc kubenswrapper[4885]: I1205 20:26:06.612093 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 20:26:06 crc kubenswrapper[4885]: I1205 20:26:06.629262 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.629242636 podStartE2EDuration="2.629242636s" podCreationTimestamp="2025-12-05 20:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:06.628841804 +0000 UTC m=+1231.925657455" watchObservedRunningTime="2025-12-05 20:26:06.629242636 +0000 UTC m=+1231.926058297" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.612110 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.622742 4885 generic.go:334] "Generic (PLEG): container finished" podID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerID="e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d" exitCode=137 Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.622784 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.622797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerDied","Data":"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d"} Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.622861 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2988bfcd-6a34-4a15-8a36-953ca658c25b","Type":"ContainerDied","Data":"dc8154d04962bf7d3ea895a0b5eae8568474085393521daa75fa73b00eebcbff"} Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.622885 4885 scope.go:117] "RemoveContainer" containerID="e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.652909 4885 scope.go:117] "RemoveContainer" containerID="844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.676095 4885 scope.go:117] "RemoveContainer" containerID="d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.703032 4885 scope.go:117] "RemoveContainer" containerID="c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.729448 4885 scope.go:117] "RemoveContainer" containerID="e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730379 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730453 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730511 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730593 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqh9d\" 
(UniqueName: \"kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.730508 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d\": container with ID starting with e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d not found: ID does not exist" containerID="e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730661 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730665 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d"} err="failed to get container status \"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d\": rpc error: code = NotFound desc = could not find container \"e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d\": container with ID starting with e3b72add245eb7162f06781a119a37e6f9946a6e57946d6774855add5ef4ea8d not found: ID does not exist" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730704 4885 scope.go:117] "RemoveContainer" containerID="844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.730690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts\") pod \"2988bfcd-6a34-4a15-8a36-953ca658c25b\" (UID: \"2988bfcd-6a34-4a15-8a36-953ca658c25b\") " Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.731196 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.731631 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.731699 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.732469 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c\": container with ID starting with 844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c not found: ID does not exist" containerID="844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.732493 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c"} err="failed to get container status \"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c\": rpc error: code = NotFound desc = could not find container \"844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c\": container with ID starting with 844afc68efe31bca488eb22dda3d04c940fe5370c830f732a3314e6c59dadd6c not found: ID does not exist" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.732511 4885 scope.go:117] "RemoveContainer" containerID="d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.732993 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1\": container with ID starting with d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1 not found: ID does not exist" containerID="d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.733009 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1"} err="failed to get container status \"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1\": rpc error: code = NotFound desc = could not find container \"d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1\": container with ID starting with d42f20830f1129dad133bcf7c1f1d66e6cb376f1d6f24bb656ae09c84a2199d1 not found: ID does not exist" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.733034 4885 scope.go:117] "RemoveContainer" containerID="c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.733269 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4\": container with ID starting with c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4 not found: ID does not exist" containerID="c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.733285 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4"} err="failed to get container status 
\"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4\": rpc error: code = NotFound desc = could not find container \"c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4\": container with ID starting with c6f72940037f69966fec82ea83d08b1b90d7c3fd3db14a2741ca5acd161a70b4 not found: ID does not exist" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.735922 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d" (OuterVolumeSpecName: "kube-api-access-lqh9d") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "kube-api-access-lqh9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.741915 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts" (OuterVolumeSpecName: "scripts") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.759904 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.802794 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.827746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data" (OuterVolumeSpecName: "config-data") pod "2988bfcd-6a34-4a15-8a36-953ca658c25b" (UID: "2988bfcd-6a34-4a15-8a36-953ca658c25b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833403 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2988bfcd-6a34-4a15-8a36-953ca658c25b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833453 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833477 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833495 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqh9d\" (UniqueName: \"kubernetes.io/projected/2988bfcd-6a34-4a15-8a36-953ca658c25b-kube-api-access-lqh9d\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833513 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.833529 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2988bfcd-6a34-4a15-8a36-953ca658c25b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.973598 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.980784 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.990733 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.991101 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="proxy-httpd" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991120 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="proxy-httpd" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.991171 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-notification-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-notification-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.991196 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-central-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991204 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-central-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: E1205 20:26:07.991222 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="sg-core" Dec 05 20:26:07 crc kubenswrapper[4885]: 
I1205 20:26:07.991228 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="sg-core" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991389 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="sg-core" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991411 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-central-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991427 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="ceilometer-notification-agent" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.991441 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" containerName="proxy-httpd" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.993193 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.995448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:26:07 crc kubenswrapper[4885]: I1205 20:26:07.995663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.024386 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139325 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139645 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8r7\" (UniqueName: \"kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139685 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0" Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 
20:26:08.139751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.139791 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.241687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.241825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.241936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242119 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242346 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242372 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242411 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8r7\" (UniqueName: \"kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.242943 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.250226 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.250613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.252808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.265490 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.287955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8r7\" (UniqueName: \"kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7\") pod \"ceilometer-0\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.312002 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 20:26:08 crc kubenswrapper[4885]: W1205 20:26:08.789217 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ad9815_1330_4d91_aeab_4bb6540bd8bf.slice/crio-cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74 WatchSource:0}: Error finding container cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74: Status 404 returned error can't find the container with id cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74
Dec 05 20:26:08 crc kubenswrapper[4885]: I1205 20:26:08.795513 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 20:26:09 crc kubenswrapper[4885]: I1205 20:26:09.182817 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2988bfcd-6a34-4a15-8a36-953ca658c25b" path="/var/lib/kubelet/pods/2988bfcd-6a34-4a15-8a36-953ca658c25b/volumes"
Dec 05 20:26:09 crc kubenswrapper[4885]: I1205 20:26:09.655941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerStarted","Data":"e41f111392e604df32d47b83bac1edfb7317e5efe6a6496250a619df930a3ebc"}
Dec 05 20:26:09 crc kubenswrapper[4885]: I1205 20:26:09.656288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerStarted","Data":"cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74"}
Dec 05 20:26:14 crc kubenswrapper[4885]: I1205 20:26:14.701807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerStarted","Data":"64001c8f95977d9d006aa1bdf92e0186a77e6651db187862588d6edb57d88429"}
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.104638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.663582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9qst6"]
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.665327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.667758 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.667854 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.679413 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9qst6"]
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.713845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerStarted","Data":"99426b8a91ce79a2fbf126d13e9637a7ee46de85c53a0fd3b3cb46e6fff324b0"}
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.778607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnk6j\" (UniqueName: \"kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.778816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.778933 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.779002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.867173 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.868266 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.872662 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.878501 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.883502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.883606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnk6j\" (UniqueName: \"kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.883709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.883747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.893915 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.895816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.922724 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.936638 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnk6j\" (UniqueName: \"kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j\") pod \"nova-cell0-cell-mapping-9qst6\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.980709 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.981850 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.987755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.988997 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.989046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhlm\" (UniqueName: \"kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.989102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.989313 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9qst6"
Dec 05 20:26:15 crc kubenswrapper[4885]: I1205 20:26:15.989842 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.001077 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.007781 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.010104 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.019510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091548 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qgr\" (UniqueName: \"kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091568 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091704 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49dj\" (UniqueName: \"kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.091795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhlm\" (UniqueName: \"kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.097443 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.098584 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.145391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhlm\" (UniqueName: \"kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm\") pod \"nova-cell1-novncproxy-0\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.193266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49dj\" (UniqueName: \"kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.193350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197159 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qgr\" (UniqueName: \"kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197254 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197326 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197355 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.197554 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.237849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.238151 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.238516 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.241721 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.243125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49dj\" (UniqueName: \"kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj\") pod \"nova-scheduler-0\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.245317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qgr\" (UniqueName: \"kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.248111 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.253793 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.285356 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.287690 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.293116 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.306114 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.318269 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414319 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414404 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414471 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb5p6\" (UniqueName: \"kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414537 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.414565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljr8s\" (UniqueName: \"kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.439258 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.458738 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.497851 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518887 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb5p6\" (UniqueName: \"kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518919 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.518950 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljr8s\" (UniqueName: \"kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519115 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519145 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.519936 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.520228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.520499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.520598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.521889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.525064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.533004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.538123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljr8s\" (UniqueName: \"kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s\") pod \"nova-api-0\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.552717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb5p6\" (UniqueName: \"kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6\") pod \"dnsmasq-dns-b9ff45c7-dlf42\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.581534 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.620906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9qst6"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.660771 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.729408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerStarted","Data":"82bd2550776c48980a4eadd10c0e1df19e101cb4a3ff0d65ea35a405e2624015"}
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.730689 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.753366 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6985453980000003 podStartE2EDuration="9.75334829s" podCreationTimestamp="2025-12-05 20:26:07 +0000 UTC" firstStartedPulling="2025-12-05 20:26:08.793232036 +0000 UTC m=+1234.090047697" lastFinishedPulling="2025-12-05 20:26:15.848034928 +0000 UTC m=+1241.144850589" observedRunningTime="2025-12-05 20:26:16.750278826 +0000 UTC m=+1242.047094487" watchObservedRunningTime="2025-12-05 20:26:16.75334829 +0000 UTC m=+1242.050163951"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.861382 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bv27r"]
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.868108 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.876513 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.876779 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 05 20:26:16 crc kubenswrapper[4885]: I1205 20:26:16.878847 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bv27r"]
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.030737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.031080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2csf\" (UniqueName: \"kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.031161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.031184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.128577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.133685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.133728 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.133860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.133884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2csf\" (UniqueName: \"kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: W1205 20:26:17.134912 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb70f635_5e81_4e4f_bce1_77298cfc9fab.slice/crio-3ce577ae748f8b4f44479c4f6a0315cd27cfd8cafda5508b3b2ad8822f8393ff WatchSource:0}: Error finding container 3ce577ae748f8b4f44479c4f6a0315cd27cfd8cafda5508b3b2ad8822f8393ff: Status 404 returned error can't find the container with id 3ce577ae748f8b4f44479c4f6a0315cd27cfd8cafda5508b3b2ad8822f8393ff
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.140459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.142175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.143175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.143524 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.154047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2csf\" (UniqueName: \"kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf\") pod \"nova-cell1-conductor-db-sync-bv27r\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: W1205 20:26:17.257154 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1770387e_755d_445c_be41_29d372f71ba7.slice/crio-42f241f6ce0072dbbcd74c74db7426c1b30dc653d63e9aa8bf7523f8287153d6 WatchSource:0}: Error finding container 42f241f6ce0072dbbcd74c74db7426c1b30dc653d63e9aa8bf7523f8287153d6: Status 404 returned error can't find the container with id 42f241f6ce0072dbbcd74c74db7426c1b30dc653d63e9aa8bf7523f8287153d6
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.266728 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.318750 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bv27r"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.358610 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"]
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.366499 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:26:17 crc kubenswrapper[4885]: W1205 20:26:17.373596 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a873296_2fb6_42f4_b88b_30a8292bc14e.slice/crio-26d33f997234f1dc529791db5dff3523fe6a8d946a62c3c417163b6e5d76e6a4 WatchSource:0}: Error finding container 26d33f997234f1dc529791db5dff3523fe6a8d946a62c3c417163b6e5d76e6a4: Status 404 returned error can't find the container with id 26d33f997234f1dc529791db5dff3523fe6a8d946a62c3c417163b6e5d76e6a4
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.743483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerStarted","Data":"6feb9281d57cfc6b7a737a95e353f9b9a0febb4e49ed6151662e5c1f1946f6c0"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.747497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9qst6" event={"ID":"e31833f3-c584-4352-bf8c-03e18def1ea2","Type":"ContainerStarted","Data":"aed89c1191b75d1079c9c31b64b659a5796e3a63edfdbe2b5e6f4b582c388024"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.747524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9qst6" event={"ID":"e31833f3-c584-4352-bf8c-03e18def1ea2","Type":"ContainerStarted","Data":"bf5231efd380da7fffcd79f5b25461e77358ca00e234c9151cc942d210375123"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.750230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerStarted","Data":"3ce577ae748f8b4f44479c4f6a0315cd27cfd8cafda5508b3b2ad8822f8393ff"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.751690 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerID="7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346" exitCode=0
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.751737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" event={"ID":"0a873296-2fb6-42f4-b88b-30a8292bc14e","Type":"ContainerDied","Data":"7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.751752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" event={"ID":"0a873296-2fb6-42f4-b88b-30a8292bc14e","Type":"ContainerStarted","Data":"26d33f997234f1dc529791db5dff3523fe6a8d946a62c3c417163b6e5d76e6a4"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.758489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1770387e-755d-445c-be41-29d372f71ba7","Type":"ContainerStarted","Data":"42f241f6ce0072dbbcd74c74db7426c1b30dc653d63e9aa8bf7523f8287153d6"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.760215 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"495ff886-38af-4072-b162-8dc68cb0a0ec","Type":"ContainerStarted","Data":"2b42fcbc7e777f336e66bef4a21ba1bfe730a459fb49dcf54197ccb0639f589c"}
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.775506 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9qst6" podStartSLOduration=2.775490173 podStartE2EDuration="2.775490173s" podCreationTimestamp="2025-12-05 20:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:17.774911835 +0000 UTC m=+1243.071727506" watchObservedRunningTime="2025-12-05 20:26:17.775490173 +0000 UTC m=+1243.072305834"
Dec 05 20:26:17 crc kubenswrapper[4885]: I1205 20:26:17.831512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bv27r"]
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.794544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bv27r" event={"ID":"ad8c8f0f-88d1-4be1-8db0-882fac969fce","Type":"ContainerStarted","Data":"de1fc11cb66d09131a6dc49a451f0dbe994f12c1e351512c7a70089e2b3da346"}
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.795197 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bv27r" event={"ID":"ad8c8f0f-88d1-4be1-8db0-882fac969fce","Type":"ContainerStarted","Data":"bde1092c5bc1bbf2bdeb60f530d644dc177daa70097dff94c89e34dfae6a21b5"}
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.802979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" event={"ID":"0a873296-2fb6-42f4-b88b-30a8292bc14e","Type":"ContainerStarted","Data":"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735"}
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.803217 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42"
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.810512 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bv27r" podStartSLOduration=2.8104973319999997 podStartE2EDuration="2.810497332s" podCreationTimestamp="2025-12-05 20:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:18.808263243 +0000 UTC m=+1244.105078904" watchObservedRunningTime="2025-12-05 20:26:18.810497332 +0000 UTC m=+1244.107312993"
Dec 05 20:26:18 crc kubenswrapper[4885]: I1205 20:26:18.831273 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" podStartSLOduration=2.83125691 podStartE2EDuration="2.83125691s" podCreationTimestamp="2025-12-05 20:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:18.825895265 +0000 UTC m=+1244.122710926" watchObservedRunningTime="2025-12-05 20:26:18.83125691 +0000 UTC m=+1244.128072571"
Dec 05 20:26:19 crc kubenswrapper[4885]: I1205 20:26:19.256456 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:19 crc kubenswrapper[4885]: I1205 20:26:19.269984 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.837174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"495ff886-38af-4072-b162-8dc68cb0a0ec","Type":"ContainerStarted","Data":"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac"}
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.837369 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="495ff886-38af-4072-b162-8dc68cb0a0ec" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac" gracePeriod=30
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.841249 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerStarted","Data":"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d"}
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.843237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerStarted","Data":"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf"}
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.844954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1770387e-755d-445c-be41-29d372f71ba7","Type":"ContainerStarted","Data":"d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea"}
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.864465 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.585560763 podStartE2EDuration="5.864443955s" podCreationTimestamp="2025-12-05 20:26:15 +0000 UTC" firstStartedPulling="2025-12-05 20:26:17.137822619 +0000 UTC m=+1242.434638280" lastFinishedPulling="2025-12-05 20:26:20.416705811 +0000 UTC m=+1245.713521472" observedRunningTime="2025-12-05 20:26:20.855257662 +0000 UTC m=+1246.152073353" watchObservedRunningTime="2025-12-05 20:26:20.864443955 +0000 UTC m=+1246.161259616"
Dec 05 20:26:20 crc kubenswrapper[4885]: I1205 20:26:20.874456 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.715631513 podStartE2EDuration="5.874441213s" podCreationTimestamp="2025-12-05 20:26:15 +0000 UTC" firstStartedPulling="2025-12-05 20:26:17.259292434 +0000 UTC m=+1242.556108095" lastFinishedPulling="2025-12-05 20:26:20.418102134 +0000 UTC m=+1245.714917795" observedRunningTime="2025-12-05 20:26:20.872321087 +0000 UTC m=+1246.169136768" watchObservedRunningTime="2025-12-05 20:26:20.874441213 +0000 UTC m=+1246.171256874"
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.440496 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.460337 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.855579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerStarted","Data":"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3"}
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.855652 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-log" containerID="cri-o://42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" gracePeriod=30
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.855764 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-metadata" containerID="cri-o://4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3" gracePeriod=30
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.861670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerStarted","Data":"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a"}
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.879346 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.598862157 podStartE2EDuration="6.879327606s" podCreationTimestamp="2025-12-05 20:26:15 +0000 UTC" firstStartedPulling="2025-12-05 20:26:17.138767789 +0000 UTC m=+1242.435583450" lastFinishedPulling="2025-12-05 20:26:20.419233238 +0000 UTC m=+1245.716048899" observedRunningTime="2025-12-05 20:26:21.875754665 +0000 UTC m=+1247.172570326" watchObservedRunningTime="2025-12-05 20:26:21.879327606 +0000 UTC m=+1247.176143267"
Dec 05 20:26:21 crc kubenswrapper[4885]: I1205 20:26:21.900182 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.848577412 podStartE2EDuration="5.900160936s" podCreationTimestamp="2025-12-05 20:26:16 +0000 UTC" firstStartedPulling="2025-12-05 20:26:17.369374038 +0000 UTC m=+1242.666189699" lastFinishedPulling="2025-12-05 20:26:20.420957542 +0000 UTC m=+1245.717773223" observedRunningTime="2025-12-05 20:26:21.897283137 +0000 UTC m=+1247.194098798" watchObservedRunningTime="2025-12-05 20:26:21.900160936 +0000 UTC m=+1247.196976597"
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.493000 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.550899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qgr\" (UniqueName: \"kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr\") pod \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") "
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.551052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle\") pod \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") "
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.551253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs\") pod \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") "
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.551558 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs" (OuterVolumeSpecName: "logs") pod "eb70f635-5e81-4e4f-bce1-77298cfc9fab" (UID: "eb70f635-5e81-4e4f-bce1-77298cfc9fab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.551625 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data\") pod \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\" (UID: \"eb70f635-5e81-4e4f-bce1-77298cfc9fab\") "
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.552538 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70f635-5e81-4e4f-bce1-77298cfc9fab-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.562162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr" (OuterVolumeSpecName: "kube-api-access-z9qgr") pod "eb70f635-5e81-4e4f-bce1-77298cfc9fab" (UID: "eb70f635-5e81-4e4f-bce1-77298cfc9fab"). InnerVolumeSpecName "kube-api-access-z9qgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.587977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb70f635-5e81-4e4f-bce1-77298cfc9fab" (UID: "eb70f635-5e81-4e4f-bce1-77298cfc9fab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.590539 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data" (OuterVolumeSpecName: "config-data") pod "eb70f635-5e81-4e4f-bce1-77298cfc9fab" (UID: "eb70f635-5e81-4e4f-bce1-77298cfc9fab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.654424 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.655835 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qgr\" (UniqueName: \"kubernetes.io/projected/eb70f635-5e81-4e4f-bce1-77298cfc9fab-kube-api-access-z9qgr\") on node \"crc\" DevicePath \"\""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.655856 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70f635-5e81-4e4f-bce1-77298cfc9fab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871553 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerID="4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3" exitCode=0
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871581 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerID="42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" exitCode=143
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871593 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerDied","Data":"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3"}
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerDied","Data":"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf"}
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871638 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb70f635-5e81-4e4f-bce1-77298cfc9fab","Type":"ContainerDied","Data":"3ce577ae748f8b4f44479c4f6a0315cd27cfd8cafda5508b3b2ad8822f8393ff"}
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871652 4885 scope.go:117] "RemoveContainer" containerID="4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3"
Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.871661 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.897150 4885 scope.go:117] "RemoveContainer" containerID="42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.917194 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.949218 4885 scope.go:117] "RemoveContainer" containerID="4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3" Dec 05 20:26:22 crc kubenswrapper[4885]: E1205 20:26:22.970544 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3\": container with ID starting with 4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3 not found: ID does not exist" containerID="4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.970590 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3"} err="failed to get container status \"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3\": rpc error: code = NotFound desc = could not find container \"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3\": container with ID starting with 4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3 not found: ID does not exist" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.970617 4885 scope.go:117] "RemoveContainer" containerID="42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" Dec 05 20:26:22 crc kubenswrapper[4885]: E1205 20:26:22.973814 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf\": container with ID starting with 42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf not found: ID does not exist" containerID="42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.973860 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf"} err="failed to get container status \"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf\": rpc error: code = NotFound desc = could not find container \"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf\": container with ID starting with 42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf not found: ID does not exist" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.973925 4885 scope.go:117] "RemoveContainer" containerID="4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.974686 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3"} err="failed to get container status \"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3\": rpc error: code = NotFound desc = could not find container \"4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3\": container with ID starting with 
4db32689601d027ce95864e74523e518c2894d89679b5389dcaaac20ba32c6e3 not found: ID does not exist" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.974711 4885 scope.go:117] "RemoveContainer" containerID="42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.974998 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf"} err="failed to get container status \"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf\": rpc error: code = NotFound desc = could not find container \"42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf\": container with ID starting with 42327aac42789f7cde5f680df83c4f8ff47722c352425cab862e4533d98270cf not found: ID does not exist" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.976362 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.992332 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:22 crc kubenswrapper[4885]: E1205 20:26:22.992946 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-log" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.992968 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-log" Dec 05 20:26:22 crc kubenswrapper[4885]: E1205 20:26:22.993012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-metadata" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.993123 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-metadata" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.993354 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-log" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.993403 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" containerName="nova-metadata-metadata" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.994710 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.997736 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 20:26:22 crc kubenswrapper[4885]: I1205 20:26:22.998034 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.002801 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.074428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.074485 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.074527 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.074605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh57x\" (UniqueName: \"kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.074727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.176867 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh57x\" (UniqueName: \"kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.176965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.177085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " 
pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.177138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.177190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.177455 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.181238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.181910 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.183395 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb70f635-5e81-4e4f-bce1-77298cfc9fab" path="/var/lib/kubelet/pods/eb70f635-5e81-4e4f-bce1-77298cfc9fab/volumes" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.184756 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.196369 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh57x\" (UniqueName: \"kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x\") pod \"nova-metadata-0\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.312238 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.832975 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:23 crc kubenswrapper[4885]: I1205 20:26:23.882900 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerStarted","Data":"20749950d47ca6f963257ee5d1c5da5c8b04cbeaf25d67ed036b5cc6bfe58929"} Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.899928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerStarted","Data":"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da"} Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.900221 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerStarted","Data":"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296"} Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.905282 4885 generic.go:334] "Generic (PLEG): container finished" podID="e31833f3-c584-4352-bf8c-03e18def1ea2" containerID="aed89c1191b75d1079c9c31b64b659a5796e3a63edfdbe2b5e6f4b582c388024" exitCode=0 Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.905358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9qst6" event={"ID":"e31833f3-c584-4352-bf8c-03e18def1ea2","Type":"ContainerDied","Data":"aed89c1191b75d1079c9c31b64b659a5796e3a63edfdbe2b5e6f4b582c388024"} Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.908545 4885 generic.go:334] "Generic (PLEG): container finished" podID="ad8c8f0f-88d1-4be1-8db0-882fac969fce" containerID="de1fc11cb66d09131a6dc49a451f0dbe994f12c1e351512c7a70089e2b3da346" exitCode=0 Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.908592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bv27r" event={"ID":"ad8c8f0f-88d1-4be1-8db0-882fac969fce","Type":"ContainerDied","Data":"de1fc11cb66d09131a6dc49a451f0dbe994f12c1e351512c7a70089e2b3da346"} Dec 05 20:26:24 crc kubenswrapper[4885]: I1205 20:26:24.942979 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.942957639 podStartE2EDuration="2.942957639s" podCreationTimestamp="2025-12-05 20:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:24.923865802 +0000 UTC m=+1250.220681483" watchObservedRunningTime="2025-12-05 20:26:24.942957639 +0000 UTC m=+1250.239773300" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.412785 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9qst6" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.419587 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bv27r" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.459709 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.490419 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.544394 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data\") pod \"e31833f3-c584-4352-bf8c-03e18def1ea2\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.544988 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle\") pod \"e31833f3-c584-4352-bf8c-03e18def1ea2\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545089 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data\") pod \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts\") pod \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545239 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts\") pod \"e31833f3-c584-4352-bf8c-03e18def1ea2\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545305 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle\") pod \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545331 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnk6j\" (UniqueName: \"kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j\") pod \"e31833f3-c584-4352-bf8c-03e18def1ea2\" (UID: \"e31833f3-c584-4352-bf8c-03e18def1ea2\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.545357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2csf\" (UniqueName: \"kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf\") pod \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\" (UID: \"ad8c8f0f-88d1-4be1-8db0-882fac969fce\") " Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.550444 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts" (OuterVolumeSpecName: "scripts") pod "ad8c8f0f-88d1-4be1-8db0-882fac969fce" (UID: 
"ad8c8f0f-88d1-4be1-8db0-882fac969fce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.551254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts" (OuterVolumeSpecName: "scripts") pod "e31833f3-c584-4352-bf8c-03e18def1ea2" (UID: "e31833f3-c584-4352-bf8c-03e18def1ea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.551257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf" (OuterVolumeSpecName: "kube-api-access-g2csf") pod "ad8c8f0f-88d1-4be1-8db0-882fac969fce" (UID: "ad8c8f0f-88d1-4be1-8db0-882fac969fce"). InnerVolumeSpecName "kube-api-access-g2csf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.551834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j" (OuterVolumeSpecName: "kube-api-access-vnk6j") pod "e31833f3-c584-4352-bf8c-03e18def1ea2" (UID: "e31833f3-c584-4352-bf8c-03e18def1ea2"). InnerVolumeSpecName "kube-api-access-vnk6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.573632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data" (OuterVolumeSpecName: "config-data") pod "e31833f3-c584-4352-bf8c-03e18def1ea2" (UID: "e31833f3-c584-4352-bf8c-03e18def1ea2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.577152 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data" (OuterVolumeSpecName: "config-data") pod "ad8c8f0f-88d1-4be1-8db0-882fac969fce" (UID: "ad8c8f0f-88d1-4be1-8db0-882fac969fce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.578721 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad8c8f0f-88d1-4be1-8db0-882fac969fce" (UID: "ad8c8f0f-88d1-4be1-8db0-882fac969fce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.582172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e31833f3-c584-4352-bf8c-03e18def1ea2" (UID: "e31833f3-c584-4352-bf8c-03e18def1ea2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.583446 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.645548 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.645878 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="dnsmasq-dns" containerID="cri-o://16796d5140408d20de6741a93998d7b9fdf4c7bead09366b51b45581442e0242" gracePeriod=10 Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.657965 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658000 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658010 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658030 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658039 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31833f3-c584-4352-bf8c-03e18def1ea2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658047 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8c8f0f-88d1-4be1-8db0-882fac969fce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658054 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnk6j\" (UniqueName: \"kubernetes.io/projected/e31833f3-c584-4352-bf8c-03e18def1ea2-kube-api-access-vnk6j\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.658063 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2csf\" (UniqueName: \"kubernetes.io/projected/ad8c8f0f-88d1-4be1-8db0-882fac969fce-kube-api-access-g2csf\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.661443 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.662312 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.935056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bv27r" event={"ID":"ad8c8f0f-88d1-4be1-8db0-882fac969fce","Type":"ContainerDied","Data":"bde1092c5bc1bbf2bdeb60f530d644dc177daa70097dff94c89e34dfae6a21b5"} Dec 05 20:26:26 crc 
kubenswrapper[4885]: I1205 20:26:26.935085 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bv27r" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.935099 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde1092c5bc1bbf2bdeb60f530d644dc177daa70097dff94c89e34dfae6a21b5" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.936879 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerID="16796d5140408d20de6741a93998d7b9fdf4c7bead09366b51b45581442e0242" exitCode=0 Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.936939 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" event={"ID":"6a656ed8-8495-40ab-a37e-10f50b7eb513","Type":"ContainerDied","Data":"16796d5140408d20de6741a93998d7b9fdf4c7bead09366b51b45581442e0242"} Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.939928 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9qst6" event={"ID":"e31833f3-c584-4352-bf8c-03e18def1ea2","Type":"ContainerDied","Data":"bf5231efd380da7fffcd79f5b25461e77358ca00e234c9151cc942d210375123"} Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.939952 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf5231efd380da7fffcd79f5b25461e77358ca00e234c9151cc942d210375123" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.940002 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9qst6" Dec 05 20:26:26 crc kubenswrapper[4885]: I1205 20:26:26.998038 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.094306 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:26:27 crc kubenswrapper[4885]: E1205 20:26:27.094677 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31833f3-c584-4352-bf8c-03e18def1ea2" containerName="nova-manage" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.094687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31833f3-c584-4352-bf8c-03e18def1ea2" containerName="nova-manage" Dec 05 20:26:27 crc kubenswrapper[4885]: E1205 20:26:27.094703 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8c8f0f-88d1-4be1-8db0-882fac969fce" containerName="nova-cell1-conductor-db-sync" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.094709 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8c8f0f-88d1-4be1-8db0-882fac969fce" containerName="nova-cell1-conductor-db-sync" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.094877 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31833f3-c584-4352-bf8c-03e18def1ea2" containerName="nova-manage" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.094891 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8c8f0f-88d1-4be1-8db0-882fac969fce" containerName="nova-cell1-conductor-db-sync" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.095468 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.095538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.110236 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.140775 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.170736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.170883 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd7gx\" (UniqueName: \"kubernetes.io/projected/0d7ef835-7090-43c0-b489-8e1adc41fd47-kube-api-access-cd7gx\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.170911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.230707 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.248083 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.248300 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-log" containerID="cri-o://96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" gracePeriod=30 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.248382 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-metadata" containerID="cri-o://17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" gracePeriod=30 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271586 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271649 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kggkq\" (UniqueName: 
\"kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271814 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.271918 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config\") pod \"6a656ed8-8495-40ab-a37e-10f50b7eb513\" (UID: \"6a656ed8-8495-40ab-a37e-10f50b7eb513\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.272111 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.272215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd7gx\" (UniqueName: \"kubernetes.io/projected/0d7ef835-7090-43c0-b489-8e1adc41fd47-kube-api-access-cd7gx\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.272238 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.278993 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq" (OuterVolumeSpecName: "kube-api-access-kggkq") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "kube-api-access-kggkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.288743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd7gx\" (UniqueName: \"kubernetes.io/projected/0d7ef835-7090-43c0-b489-8e1adc41fd47-kube-api-access-cd7gx\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.288784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.291187 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7ef835-7090-43c0-b489-8e1adc41fd47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0d7ef835-7090-43c0-b489-8e1adc41fd47\") " pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.336101 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.344515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.345565 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.347509 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config" (OuterVolumeSpecName: "config") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.347940 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a656ed8-8495-40ab-a37e-10f50b7eb513" (UID: "6a656ed8-8495-40ab-a37e-10f50b7eb513"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375141 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375170 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375181 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375465 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375484 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a656ed8-8495-40ab-a37e-10f50b7eb513-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.375494 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kggkq\" (UniqueName: \"kubernetes.io/projected/6a656ed8-8495-40ab-a37e-10f50b7eb513-kube-api-access-kggkq\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.440564 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.546744 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.745219 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.745367 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.803786 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.885037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs\") pod \"970315d7-abaa-42c0-8632-de7757bff938\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.885126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle\") pod \"970315d7-abaa-42c0-8632-de7757bff938\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.885219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data\") pod \"970315d7-abaa-42c0-8632-de7757bff938\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.885292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs\") pod \"970315d7-abaa-42c0-8632-de7757bff938\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.885329 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh57x\" (UniqueName: \"kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x\") pod \"970315d7-abaa-42c0-8632-de7757bff938\" (UID: \"970315d7-abaa-42c0-8632-de7757bff938\") " Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.886771 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs" (OuterVolumeSpecName: "logs") pod "970315d7-abaa-42c0-8632-de7757bff938" (UID: "970315d7-abaa-42c0-8632-de7757bff938"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.889559 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x" (OuterVolumeSpecName: "kube-api-access-wh57x") pod "970315d7-abaa-42c0-8632-de7757bff938" (UID: "970315d7-abaa-42c0-8632-de7757bff938"). InnerVolumeSpecName "kube-api-access-wh57x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.910545 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data" (OuterVolumeSpecName: "config-data") pod "970315d7-abaa-42c0-8632-de7757bff938" (UID: "970315d7-abaa-42c0-8632-de7757bff938"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.918842 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "970315d7-abaa-42c0-8632-de7757bff938" (UID: "970315d7-abaa-42c0-8632-de7757bff938"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.943340 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "970315d7-abaa-42c0-8632-de7757bff938" (UID: "970315d7-abaa-42c0-8632-de7757bff938"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960304 4885 generic.go:334] "Generic (PLEG): container finished" podID="970315d7-abaa-42c0-8632-de7757bff938" containerID="17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" exitCode=0 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960337 4885 generic.go:334] "Generic (PLEG): container finished" podID="970315d7-abaa-42c0-8632-de7757bff938" containerID="96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" exitCode=143 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerDied","Data":"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da"} Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960410 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerDied","Data":"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296"} Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960421 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"970315d7-abaa-42c0-8632-de7757bff938","Type":"ContainerDied","Data":"20749950d47ca6f963257ee5d1c5da5c8b04cbeaf25d67ed036b5cc6bfe58929"} Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960438 4885 scope.go:117] "RemoveContainer" containerID="17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.960579 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.966869 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-log" containerID="cri-o://106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d" gracePeriod=30 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.967179 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.967674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-894d58c65-zbm4r" event={"ID":"6a656ed8-8495-40ab-a37e-10f50b7eb513","Type":"ContainerDied","Data":"da87d526edf9a4c927ed6b15fb59c60a33d065322e85e05cd4b73cd4f21a1d5e"} Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.968817 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-api" containerID="cri-o://3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a" gracePeriod=30 Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.987611 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.987644 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.987655 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970315d7-abaa-42c0-8632-de7757bff938-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.987667 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/970315d7-abaa-42c0-8632-de7757bff938-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:27 crc kubenswrapper[4885]: I1205 20:26:27.987678 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh57x\" (UniqueName: \"kubernetes.io/projected/970315d7-abaa-42c0-8632-de7757bff938-kube-api-access-wh57x\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.000172 4885 scope.go:117] "RemoveContainer" containerID="96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.014111 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.035846 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.050081 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.063716 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-894d58c65-zbm4r"] Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.068651 4885 scope.go:117] "RemoveContainer" containerID="17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.068810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.069378 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da\": container with ID starting with 
17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da not found: ID does not exist" containerID="17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.069463 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da"} err="failed to get container status \"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da\": rpc error: code = NotFound desc = could not find container \"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da\": container with ID starting with 17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da not found: ID does not exist" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.069536 4885 scope.go:117] "RemoveContainer" containerID="96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.069883 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296\": container with ID starting with 96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296 not found: ID does not exist" containerID="96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.069970 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296"} err="failed to get container status \"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296\": rpc error: code = NotFound desc = could not find container \"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296\": container with ID starting with 96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296 not found: ID does not exist" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.070049 4885 scope.go:117] "RemoveContainer" containerID="17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.070276 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da"} err="failed to get container status \"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da\": rpc error: code = NotFound desc = could not find container \"17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da\": container with ID starting with 17d98cb1e844882ef358aea26bd610980c9a5b71ba8c8acfb5ee507e269382da not found: ID does not exist" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.070348 4885 scope.go:117] "RemoveContainer" containerID="96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.070548 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296"} err="failed to get container status \"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296\": rpc error: code = NotFound desc = could not find container \"96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296\": container with ID starting with 96ad0d57a5bb3378b6b4dee0fce9b110a4f49a43516de91c84ff536ba330e296 not found: ID does not exist" Dec 05 
20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.070623 4885 scope.go:117] "RemoveContainer" containerID="16796d5140408d20de6741a93998d7b9fdf4c7bead09366b51b45581442e0242" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.076489 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.076977 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-log" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077111 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-log" Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.077198 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="dnsmasq-dns" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077275 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="dnsmasq-dns" Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.077349 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="init" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077403 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="init" Dec 05 20:26:28 crc kubenswrapper[4885]: E1205 20:26:28.077461 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-metadata" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077521 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-metadata" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077808 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-log" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077912 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="970315d7-abaa-42c0-8632-de7757bff938" containerName="nova-metadata-metadata" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.077981 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" containerName="dnsmasq-dns" Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.080424 4885 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:26:28 crc kubenswrapper[4885]: W1205 20:26:28.081685 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d7ef835_7090_43c0_b489_8e1adc41fd47.slice/crio-b2ce34a3b29aa5243ce7917efd439e2d287846d523590f7199be8d62e805a5e3 WatchSource:0}: Error finding container b2ce34a3b29aa5243ce7917efd439e2d287846d523590f7199be8d62e805a5e3: Status 404 returned error can't find the container with id b2ce34a3b29aa5243ce7917efd439e2d287846d523590f7199be8d62e805a5e3
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.084189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.084338 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.084907 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.104004 4885 scope.go:117] "RemoveContainer" containerID="a61621e78d4ec6c77d02558e97f0a20f39d377bba59fc968c55f1fae1bc3ec50"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.191853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgjc\" (UniqueName: \"kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.192236 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.192264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.192343 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.192433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.293799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.293838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.293924 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.293988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgjc\" (UniqueName: \"kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.294116 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.295002 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.300172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.306772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.311672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.312083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgjc\" (UniqueName: \"kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc\") pod \"nova-metadata-0\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") " pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.407690 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.902071 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.986983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerStarted","Data":"68f4e04e2440b14cd97a5d011069faef75d2616b41148c472ae1951794dd2b46"}
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.989240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0d7ef835-7090-43c0-b489-8e1adc41fd47","Type":"ContainerStarted","Data":"b531fe515c87bce036c29004d0246c96ac40dc5d6949c6e34a9052fd96af5d82"}
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.989280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0d7ef835-7090-43c0-b489-8e1adc41fd47","Type":"ContainerStarted","Data":"b2ce34a3b29aa5243ce7917efd439e2d287846d523590f7199be8d62e805a5e3"}
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.989791 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.994052 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerID="106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d" exitCode=143
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.994224 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1770387e-755d-445c-be41-29d372f71ba7" containerName="nova-scheduler-scheduler" containerID="cri-o://d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" gracePeriod=30
Dec 05 20:26:28 crc kubenswrapper[4885]: I1205 20:26:28.994460 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerDied","Data":"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d"}
Dec 05 20:26:29 crc kubenswrapper[4885]: I1205 20:26:29.011863 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.011843597 podStartE2EDuration="2.011843597s" podCreationTimestamp="2025-12-05 20:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:29.002092937 +0000 UTC m=+1254.298908598" watchObservedRunningTime="2025-12-05 20:26:29.011843597 +0000 UTC m=+1254.308659258"
Dec 05 20:26:29 crc kubenswrapper[4885]: I1205 20:26:29.192743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a656ed8-8495-40ab-a37e-10f50b7eb513" path="/var/lib/kubelet/pods/6a656ed8-8495-40ab-a37e-10f50b7eb513/volumes"
Dec 05 20:26:29 crc kubenswrapper[4885]: I1205 20:26:29.193438 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970315d7-abaa-42c0-8632-de7757bff938" path="/var/lib/kubelet/pods/970315d7-abaa-42c0-8632-de7757bff938/volumes"
Dec 05 20:26:30 crc kubenswrapper[4885]: I1205 20:26:30.028966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerStarted","Data":"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"}
Dec 05 20:26:30 crc kubenswrapper[4885]: I1205 20:26:30.029419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerStarted","Data":"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"}
Dec 05 20:26:30 crc kubenswrapper[4885]: I1205 20:26:30.063779 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.063753005 podStartE2EDuration="2.063753005s" podCreationTimestamp="2025-12-05 20:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:30.06359999 +0000 UTC m=+1255.360415691" watchObservedRunningTime="2025-12-05 20:26:30.063753005 +0000 UTC m=+1255.360568706"
Dec 05 20:26:31 crc kubenswrapper[4885]: E1205 20:26:31.463185 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:26:31 crc kubenswrapper[4885]: E1205 20:26:31.464658 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:26:31 crc kubenswrapper[4885]: E1205 20:26:31.466444 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:26:31 crc kubenswrapper[4885]: E1205 20:26:31.466626 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1770387e-755d-445c-be41-29d372f71ba7" containerName="nova-scheduler-scheduler"
Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.055802 4885 generic.go:334] "Generic (PLEG): container finished" podID="1770387e-755d-445c-be41-29d372f71ba7" containerID="d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" exitCode=0
Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.055982 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1770387e-755d-445c-be41-29d372f71ba7","Type":"ContainerDied","Data":"d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea"}
Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.404583 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.482986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data\") pod \"1770387e-755d-445c-be41-29d372f71ba7\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.483083 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p49dj\" (UniqueName: \"kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj\") pod \"1770387e-755d-445c-be41-29d372f71ba7\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.483163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle\") pod \"1770387e-755d-445c-be41-29d372f71ba7\" (UID: \"1770387e-755d-445c-be41-29d372f71ba7\") " Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.490335 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj" (OuterVolumeSpecName: "kube-api-access-p49dj") pod "1770387e-755d-445c-be41-29d372f71ba7" (UID: "1770387e-755d-445c-be41-29d372f71ba7"). InnerVolumeSpecName "kube-api-access-p49dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.509386 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data" (OuterVolumeSpecName: "config-data") pod "1770387e-755d-445c-be41-29d372f71ba7" (UID: "1770387e-755d-445c-be41-29d372f71ba7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.528235 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1770387e-755d-445c-be41-29d372f71ba7" (UID: "1770387e-755d-445c-be41-29d372f71ba7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.585164 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.585712 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1770387e-755d-445c-be41-29d372f71ba7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:32 crc kubenswrapper[4885]: I1205 20:26:32.585797 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p49dj\" (UniqueName: \"kubernetes.io/projected/1770387e-755d-445c-be41-29d372f71ba7-kube-api-access-p49dj\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.068539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1770387e-755d-445c-be41-29d372f71ba7","Type":"ContainerDied","Data":"42f241f6ce0072dbbcd74c74db7426c1b30dc653d63e9aa8bf7523f8287153d6"} Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.068613 4885 scope.go:117] "RemoveContainer" containerID="d5722fc8981da07faeb9b883282265a7a996b90159023f5d3a9d819066ca05ea" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.068613 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.111194 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.123135 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.133546 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:33 crc kubenswrapper[4885]: E1205 20:26:33.133934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1770387e-755d-445c-be41-29d372f71ba7" containerName="nova-scheduler-scheduler" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.133953 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1770387e-755d-445c-be41-29d372f71ba7" containerName="nova-scheduler-scheduler" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.134160 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1770387e-755d-445c-be41-29d372f71ba7" containerName="nova-scheduler-scheduler" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.134731 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.137689 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.144620 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.188307 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1770387e-755d-445c-be41-29d372f71ba7" path="/var/lib/kubelet/pods/1770387e-755d-445c-be41-29d372f71ba7/volumes" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.195959 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2bp\" (UniqueName: \"kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.196033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.196064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.297384 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2bp\" (UniqueName: \"kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.297459 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.297485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.301414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.301747 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.313502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2bp\" (UniqueName: \"kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp\") pod \"nova-scheduler-0\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.408547 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.408579 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.456937 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.758321 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.912662 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle\") pod \"b7c6473f-6f68-4841-96a6-cb9511da550e\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.912856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data\") pod \"b7c6473f-6f68-4841-96a6-cb9511da550e\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.912964 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljr8s\" (UniqueName: \"kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s\") pod \"b7c6473f-6f68-4841-96a6-cb9511da550e\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.913056 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs\") pod \"b7c6473f-6f68-4841-96a6-cb9511da550e\" (UID: \"b7c6473f-6f68-4841-96a6-cb9511da550e\") " Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.913954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs" (OuterVolumeSpecName: "logs") pod "b7c6473f-6f68-4841-96a6-cb9511da550e" (UID: "b7c6473f-6f68-4841-96a6-cb9511da550e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.914278 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7c6473f-6f68-4841-96a6-cb9511da550e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.920104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s" (OuterVolumeSpecName: "kube-api-access-ljr8s") pod "b7c6473f-6f68-4841-96a6-cb9511da550e" (UID: "b7c6473f-6f68-4841-96a6-cb9511da550e"). 
InnerVolumeSpecName "kube-api-access-ljr8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.948448 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data" (OuterVolumeSpecName: "config-data") pod "b7c6473f-6f68-4841-96a6-cb9511da550e" (UID: "b7c6473f-6f68-4841-96a6-cb9511da550e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.970769 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7c6473f-6f68-4841-96a6-cb9511da550e" (UID: "b7c6473f-6f68-4841-96a6-cb9511da550e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:33 crc kubenswrapper[4885]: I1205 20:26:33.980512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:26:33 crc kubenswrapper[4885]: W1205 20:26:33.981130 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0885bbfa_d44b_4e51_948b_8089bbb49c7b.slice/crio-226143449e5b63b9104a8df10645ce988ba9f7dbbbf17814d417f4110ef55942 WatchSource:0}: Error finding container 226143449e5b63b9104a8df10645ce988ba9f7dbbbf17814d417f4110ef55942: Status 404 returned error can't find the container with id 226143449e5b63b9104a8df10645ce988ba9f7dbbbf17814d417f4110ef55942 Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.016249 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.016280 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c6473f-6f68-4841-96a6-cb9511da550e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.016292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljr8s\" (UniqueName: \"kubernetes.io/projected/b7c6473f-6f68-4841-96a6-cb9511da550e-kube-api-access-ljr8s\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.085268 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0885bbfa-d44b-4e51-948b-8089bbb49c7b","Type":"ContainerStarted","Data":"226143449e5b63b9104a8df10645ce988ba9f7dbbbf17814d417f4110ef55942"} Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.087342 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerID="3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a" exitCode=0 Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.087434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerDied","Data":"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a"} Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.087452 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.087478 4885 scope.go:117] "RemoveContainer" containerID="3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.087464 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7c6473f-6f68-4841-96a6-cb9511da550e","Type":"ContainerDied","Data":"6feb9281d57cfc6b7a737a95e353f9b9a0febb4e49ed6151662e5c1f1946f6c0"} Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.132685 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.134104 4885 scope.go:117] "RemoveContainer" containerID="106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.149634 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.157893 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:34 crc kubenswrapper[4885]: E1205 20:26:34.160634 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-api" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.160713 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-api" Dec 05 20:26:34 crc kubenswrapper[4885]: E1205 20:26:34.160807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-log" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.160867 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-log" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.161140 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-api" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.161231 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" containerName="nova-api-log" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.162368 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.164943 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.168609 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.184891 4885 scope.go:117] "RemoveContainer" containerID="3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a" Dec 05 20:26:34 crc kubenswrapper[4885]: E1205 20:26:34.187548 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a\": container with ID starting with 3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a not found: ID does not exist" containerID="3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.187591 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a"} err="failed to get container status \"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a\": rpc error: code = NotFound desc = could not find container \"3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a\": container with ID starting with 3b394c30ab9327b069f03fa539fa065b66dec6d6ed18fcd56a46cee57e52882a not found: ID does not exist" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.187615 4885 scope.go:117] "RemoveContainer" containerID="106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d" Dec 05 20:26:34 crc kubenswrapper[4885]: E1205 20:26:34.188743 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d\": container with ID starting with 106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d not found: ID does not exist" containerID="106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.188796 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d"} err="failed to get container status \"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d\": rpc error: code = NotFound desc = could not find container \"106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d\": container with ID starting with 106577e7cb6fb07faf89144149a2c61171649fddb80838e27da00a1cea0f760d not found: ID does not exist" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.220406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.220466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " 
pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.220511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.220547 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhxb\" (UniqueName: \"kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.322345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.322414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.322450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.322498 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhxb\" (UniqueName: \"kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.322900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.327477 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.328050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.348121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhxb\" (UniqueName: \"kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb\") pod \"nova-api-0\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " pod="openstack/nova-api-0" Dec 05 
20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.489217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:26:34 crc kubenswrapper[4885]: I1205 20:26:34.980371 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:34 crc kubenswrapper[4885]: W1205 20:26:34.986397 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeb8fa29_9ae0_4760_ac67_8e0fbe016294.slice/crio-a1d713781c909135a25e7ba43029f43bce287c264b64430349dffbc96bde02ea WatchSource:0}: Error finding container a1d713781c909135a25e7ba43029f43bce287c264b64430349dffbc96bde02ea: Status 404 returned error can't find the container with id a1d713781c909135a25e7ba43029f43bce287c264b64430349dffbc96bde02ea Dec 05 20:26:35 crc kubenswrapper[4885]: I1205 20:26:35.102769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerStarted","Data":"a1d713781c909135a25e7ba43029f43bce287c264b64430349dffbc96bde02ea"} Dec 05 20:26:35 crc kubenswrapper[4885]: I1205 20:26:35.104410 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0885bbfa-d44b-4e51-948b-8089bbb49c7b","Type":"ContainerStarted","Data":"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26"} Dec 05 20:26:35 crc kubenswrapper[4885]: I1205 20:26:35.131285 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.131266373 podStartE2EDuration="2.131266373s" podCreationTimestamp="2025-12-05 20:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:35.119351267 +0000 UTC m=+1260.416166928" watchObservedRunningTime="2025-12-05 20:26:35.131266373 +0000 UTC m=+1260.428082034" Dec 05 20:26:35 crc kubenswrapper[4885]: I1205 20:26:35.187075 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c6473f-6f68-4841-96a6-cb9511da550e" path="/var/lib/kubelet/pods/b7c6473f-6f68-4841-96a6-cb9511da550e/volumes" Dec 05 20:26:36 crc kubenswrapper[4885]: I1205 20:26:36.123162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerStarted","Data":"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856"} Dec 05 20:26:36 crc kubenswrapper[4885]: I1205 20:26:36.123662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerStarted","Data":"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87"} Dec 05 20:26:36 crc kubenswrapper[4885]: I1205 20:26:36.155644 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.155621784 podStartE2EDuration="2.155621784s" podCreationTimestamp="2025-12-05 20:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:36.148894448 +0000 UTC m=+1261.445710149" watchObservedRunningTime="2025-12-05 20:26:36.155621784 +0000 UTC m=+1261.452437455" Dec 05 20:26:37 crc kubenswrapper[4885]: I1205 20:26:37.496221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-conductor-0" Dec 05 20:26:38 crc kubenswrapper[4885]: I1205 20:26:38.316859 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 20:26:38 crc kubenswrapper[4885]: I1205 20:26:38.409493 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:26:38 crc kubenswrapper[4885]: I1205 20:26:38.409533 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:26:38 crc kubenswrapper[4885]: I1205 20:26:38.457590 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 20:26:39 crc kubenswrapper[4885]: I1205 20:26:39.422239 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:39 crc kubenswrapper[4885]: I1205 20:26:39.422267 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:41 crc kubenswrapper[4885]: I1205 20:26:41.792810 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:41 crc kubenswrapper[4885]: I1205 20:26:41.794226 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" containerName="kube-state-metrics" containerID="cri-o://c656f3227fd86c984cc32aa4c4551af055f62b5b7fed31bb23e4be876f42b07e" gracePeriod=30 Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.173387 4885 generic.go:334] "Generic (PLEG): container finished" podID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" containerID="c656f3227fd86c984cc32aa4c4551af055f62b5b7fed31bb23e4be876f42b07e" exitCode=2 Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.173462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0","Type":"ContainerDied","Data":"c656f3227fd86c984cc32aa4c4551af055f62b5b7fed31bb23e4be876f42b07e"} Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.345662 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.476367 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khp4x\" (UniqueName: \"kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x\") pod \"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0\" (UID: \"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0\") " Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.489225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x" (OuterVolumeSpecName: "kube-api-access-khp4x") pod "c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" (UID: "c2ff7d19-e58f-467f-aaed-fc34e25e6dc0"). InnerVolumeSpecName "kube-api-access-khp4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:42 crc kubenswrapper[4885]: I1205 20:26:42.578528 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khp4x\" (UniqueName: \"kubernetes.io/projected/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0-kube-api-access-khp4x\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.184837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2ff7d19-e58f-467f-aaed-fc34e25e6dc0","Type":"ContainerDied","Data":"07a6348626dc6d688e414928c3b7eee660610dbbc8f06731c679d93f8b8a003b"} Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.184904 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.185857 4885 scope.go:117] "RemoveContainer" containerID="c656f3227fd86c984cc32aa4c4551af055f62b5b7fed31bb23e4be876f42b07e" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.226326 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.236192 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.249043 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:43 crc kubenswrapper[4885]: E1205 20:26:43.249651 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" containerName="kube-state-metrics" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.249678 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" containerName="kube-state-metrics" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.250009 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" containerName="kube-state-metrics" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.251037 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.253117 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.257927 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.260208 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.291092 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skv5j\" (UniqueName: \"kubernetes.io/projected/34d68d6f-5309-4dd5-b361-811ddff64379-kube-api-access-skv5j\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.291428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.291500 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.292123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.394062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.394133 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.394198 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.394283 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skv5j\" 
(UniqueName: \"kubernetes.io/projected/34d68d6f-5309-4dd5-b361-811ddff64379-kube-api-access-skv5j\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.398772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.398821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.399876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d68d6f-5309-4dd5-b361-811ddff64379-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.412395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skv5j\" (UniqueName: \"kubernetes.io/projected/34d68d6f-5309-4dd5-b361-811ddff64379-kube-api-access-skv5j\") pod \"kube-state-metrics-0\" (UID: \"34d68d6f-5309-4dd5-b361-811ddff64379\") " pod="openstack/kube-state-metrics-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.458039 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.490139 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.511834 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.512128 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-central-agent" containerID="cri-o://e41f111392e604df32d47b83bac1edfb7317e5efe6a6496250a619df930a3ebc" gracePeriod=30 Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.512192 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-notification-agent" containerID="cri-o://64001c8f95977d9d006aa1bdf92e0186a77e6651db187862588d6edb57d88429" gracePeriod=30 Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.512220 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="proxy-httpd" containerID="cri-o://82bd2550776c48980a4eadd10c0e1df19e101cb4a3ff0d65ea35a405e2624015" gracePeriod=30 Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.512191 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="sg-core" 
containerID="cri-o://99426b8a91ce79a2fbf126d13e9637a7ee46de85c53a0fd3b3cb46e6fff324b0" gracePeriod=30 Dec 05 20:26:43 crc kubenswrapper[4885]: I1205 20:26:43.569872 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.023801 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.194795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34d68d6f-5309-4dd5-b361-811ddff64379","Type":"ContainerStarted","Data":"75e66a569c41c697a15c6ad5eba7b4241d5e40f9a883b4831a743ccc848ca5cb"} Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.197318 4885 generic.go:334] "Generic (PLEG): container finished" podID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerID="82bd2550776c48980a4eadd10c0e1df19e101cb4a3ff0d65ea35a405e2624015" exitCode=0 Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.197338 4885 generic.go:334] "Generic (PLEG): container finished" podID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerID="99426b8a91ce79a2fbf126d13e9637a7ee46de85c53a0fd3b3cb46e6fff324b0" exitCode=2 Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.197345 4885 generic.go:334] "Generic (PLEG): container finished" podID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerID="e41f111392e604df32d47b83bac1edfb7317e5efe6a6496250a619df930a3ebc" exitCode=0 Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.198158 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerDied","Data":"82bd2550776c48980a4eadd10c0e1df19e101cb4a3ff0d65ea35a405e2624015"} Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.198183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerDied","Data":"99426b8a91ce79a2fbf126d13e9637a7ee46de85c53a0fd3b3cb46e6fff324b0"} Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.198193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerDied","Data":"e41f111392e604df32d47b83bac1edfb7317e5efe6a6496250a619df930a3ebc"} Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.228624 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.490331 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:26:44 crc kubenswrapper[4885]: I1205 20:26:44.490377 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.184820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ff7d19-e58f-467f-aaed-fc34e25e6dc0" path="/var/lib/kubelet/pods/c2ff7d19-e58f-467f-aaed-fc34e25e6dc0/volumes" Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.207975 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34d68d6f-5309-4dd5-b361-811ddff64379","Type":"ContainerStarted","Data":"c73b4bf436b37680fc9440da8d70c5598fc4520d162d1b0638123c94b82baa3f"} Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.208043 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.233276 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.86457055 podStartE2EDuration="2.233260154s" podCreationTimestamp="2025-12-05 20:26:43 +0000 UTC" firstStartedPulling="2025-12-05 20:26:44.022776391 +0000 UTC m=+1269.319592052" lastFinishedPulling="2025-12-05 20:26:44.391465995 +0000 UTC m=+1269.688281656" observedRunningTime="2025-12-05 20:26:45.229262931 +0000 UTC m=+1270.526078592" watchObservedRunningTime="2025-12-05 20:26:45.233260154 +0000 UTC m=+1270.530075815" Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.573271 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:45 crc kubenswrapper[4885]: I1205 20:26:45.573352 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.221729 4885 generic.go:334] "Generic (PLEG): container finished" podID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerID="64001c8f95977d9d006aa1bdf92e0186a77e6651db187862588d6edb57d88429" exitCode=0 Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.221811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerDied","Data":"64001c8f95977d9d006aa1bdf92e0186a77e6651db187862588d6edb57d88429"} Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.222061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ad9815-1330-4d91-aeab-4bb6540bd8bf","Type":"ContainerDied","Data":"cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74"} Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.222079 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc149308871c4c0a0b9a8cbe38acb71f893920eb66c1f9f43cf51c157dcb6b74" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.307103 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.367986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368086 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368205 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368264 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368294 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8r7\" (UniqueName: \"kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.368339 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts\") pod \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\" (UID: \"a6ad9815-1330-4d91-aeab-4bb6540bd8bf\") " Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.369481 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.370486 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.376197 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts" (OuterVolumeSpecName: "scripts") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.387589 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7" (OuterVolumeSpecName: "kube-api-access-hn8r7") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "kube-api-access-hn8r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.422441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.463105 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470350 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470378 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470387 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470401 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470410 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8r7\" (UniqueName: \"kubernetes.io/projected/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-kube-api-access-hn8r7\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.470420 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.493927 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data" (OuterVolumeSpecName: "config-data") pod "a6ad9815-1330-4d91-aeab-4bb6540bd8bf" (UID: "a6ad9815-1330-4d91-aeab-4bb6540bd8bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:46 crc kubenswrapper[4885]: I1205 20:26:46.571849 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ad9815-1330-4d91-aeab-4bb6540bd8bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.229928 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.250543 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.257647 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.270688 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:47 crc kubenswrapper[4885]: E1205 20:26:47.271066 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-central-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271083 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-central-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: E1205 20:26:47.271100 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="sg-core" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271109 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="sg-core" Dec 05 20:26:47 crc kubenswrapper[4885]: E1205 20:26:47.271132 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-notification-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271138 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-notification-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: E1205 20:26:47.271155 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="proxy-httpd" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271161 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="proxy-httpd" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271764 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="sg-core" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271793 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-notification-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271807 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="proxy-httpd" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.271823 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" containerName="ceilometer-central-agent" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.273447 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.275504 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.276592 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.277182 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.281686 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.406427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.406901 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.407262 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.407547 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghcr\" (UniqueName: \"kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.408861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.409225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.409535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 
20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.409770 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.512846 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.512924 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.512959 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.512988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghcr\" (UniqueName: \"kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.513044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.513062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.513110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.513136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.513743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc 
kubenswrapper[4885]: I1205 20:26:47.514351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.519236 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.522558 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.529073 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.529535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.533012 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.535298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghcr\" (UniqueName: \"kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr\") pod \"ceilometer-0\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " pod="openstack/ceilometer-0" Dec 05 20:26:47 crc kubenswrapper[4885]: I1205 20:26:47.590208 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:48 crc kubenswrapper[4885]: W1205 20:26:48.089668 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d7ae736_7995_4f53_9e77_48da9146fda8.slice/crio-0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166 WatchSource:0}: Error finding container 0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166: Status 404 returned error can't find the container with id 0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166 Dec 05 20:26:48 crc kubenswrapper[4885]: I1205 20:26:48.099233 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:48 crc kubenswrapper[4885]: I1205 20:26:48.241349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerStarted","Data":"0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166"} Dec 05 20:26:48 crc kubenswrapper[4885]: I1205 20:26:48.414913 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:26:48 crc kubenswrapper[4885]: I1205 20:26:48.416528 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:26:48 crc kubenswrapper[4885]: I1205 20:26:48.424830 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:26:49 crc kubenswrapper[4885]: I1205 20:26:49.185054 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ad9815-1330-4d91-aeab-4bb6540bd8bf" path="/var/lib/kubelet/pods/a6ad9815-1330-4d91-aeab-4bb6540bd8bf/volumes" Dec 05 20:26:49 crc kubenswrapper[4885]: I1205 20:26:49.251696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerStarted","Data":"6d79bb4c47a9cfe5885bb13b5e4b860cf255d87e77d729673b45b16dfe054862"} Dec 05 20:26:49 crc kubenswrapper[4885]: I1205 20:26:49.257823 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:26:50 crc kubenswrapper[4885]: I1205 20:26:50.295526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerStarted","Data":"9a13385ba5146482e1ce3d79a09d837232e20e1d7622901bf8f6d54d0d03433f"} Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.219913 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.307703 4885 generic.go:334] "Generic (PLEG): container finished" podID="495ff886-38af-4072-b162-8dc68cb0a0ec" containerID="f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac" exitCode=137 Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.307802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"495ff886-38af-4072-b162-8dc68cb0a0ec","Type":"ContainerDied","Data":"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac"} Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.308114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"495ff886-38af-4072-b162-8dc68cb0a0ec","Type":"ContainerDied","Data":"2b42fcbc7e777f336e66bef4a21ba1bfe730a459fb49dcf54197ccb0639f589c"} Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.307832 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.308138 4885 scope.go:117] "RemoveContainer" containerID="f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.311576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerStarted","Data":"5f0393a95995f9e28baffd6fdf0f1ab0081bd14e7ef33553c73ece93657bbb83"} Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.327863 4885 scope.go:117] "RemoveContainer" containerID="f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac" Dec 05 20:26:51 crc kubenswrapper[4885]: E1205 20:26:51.328980 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac\": container with ID starting with f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac not found: ID does not exist" containerID="f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.329011 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac"} err="failed to get container status \"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac\": rpc error: code = NotFound desc = could not find container \"f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac\": container with ID starting with f82b909bd431b876d2961953000a5180693b43d5c801b7c774021a8a9eb880ac not found: ID does not exist" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.400454 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhlm\" (UniqueName: \"kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm\") pod \"495ff886-38af-4072-b162-8dc68cb0a0ec\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.400559 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle\") pod \"495ff886-38af-4072-b162-8dc68cb0a0ec\" (UID: 
\"495ff886-38af-4072-b162-8dc68cb0a0ec\") " Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.400643 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data\") pod \"495ff886-38af-4072-b162-8dc68cb0a0ec\" (UID: \"495ff886-38af-4072-b162-8dc68cb0a0ec\") " Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.406108 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm" (OuterVolumeSpecName: "kube-api-access-tfhlm") pod "495ff886-38af-4072-b162-8dc68cb0a0ec" (UID: "495ff886-38af-4072-b162-8dc68cb0a0ec"). InnerVolumeSpecName "kube-api-access-tfhlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.428061 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "495ff886-38af-4072-b162-8dc68cb0a0ec" (UID: "495ff886-38af-4072-b162-8dc68cb0a0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.429960 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data" (OuterVolumeSpecName: "config-data") pod "495ff886-38af-4072-b162-8dc68cb0a0ec" (UID: "495ff886-38af-4072-b162-8dc68cb0a0ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.503745 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhlm\" (UniqueName: \"kubernetes.io/projected/495ff886-38af-4072-b162-8dc68cb0a0ec-kube-api-access-tfhlm\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.503772 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.503782 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495ff886-38af-4072-b162-8dc68cb0a0ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.661747 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.673185 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.683922 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:26:51 crc kubenswrapper[4885]: E1205 20:26:51.684453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495ff886-38af-4072-b162-8dc68cb0a0ec" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.684478 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="495ff886-38af-4072-b162-8dc68cb0a0ec" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.684754 4885 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="495ff886-38af-4072-b162-8dc68cb0a0ec" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.685579 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.687597 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.688330 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.691072 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.694743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.810524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.810949 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnd7j\" (UniqueName: \"kubernetes.io/projected/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-kube-api-access-mnd7j\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.810990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.811209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.811238 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.913278 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.913347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mnd7j\" (UniqueName: \"kubernetes.io/projected/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-kube-api-access-mnd7j\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.913389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.913442 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.913466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.917336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.917383 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.917414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.917766 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:51 crc kubenswrapper[4885]: I1205 20:26:51.930056 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnd7j\" (UniqueName: \"kubernetes.io/projected/d5627a8a-d602-4c23-bb2f-e07f9c2a8681-kube-api-access-mnd7j\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5627a8a-d602-4c23-bb2f-e07f9c2a8681\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:52 crc kubenswrapper[4885]: I1205 20:26:52.009734 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:52 crc kubenswrapper[4885]: I1205 20:26:52.327926 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerStarted","Data":"3b5088b7655762e894c5483c300c45f9247d9c330fa09bc966035c2930b6e0ee"} Dec 05 20:26:52 crc kubenswrapper[4885]: I1205 20:26:52.328100 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:26:52 crc kubenswrapper[4885]: I1205 20:26:52.362597 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.946420526 podStartE2EDuration="5.362579447s" podCreationTimestamp="2025-12-05 20:26:47 +0000 UTC" firstStartedPulling="2025-12-05 20:26:48.093393752 +0000 UTC m=+1273.390209423" lastFinishedPulling="2025-12-05 20:26:51.509552683 +0000 UTC m=+1276.806368344" observedRunningTime="2025-12-05 20:26:52.351357922 +0000 UTC m=+1277.648173603" watchObservedRunningTime="2025-12-05 20:26:52.362579447 +0000 UTC m=+1277.659395108" Dec 05 20:26:52 crc kubenswrapper[4885]: W1205 20:26:52.471149 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5627a8a_d602_4c23_bb2f_e07f9c2a8681.slice/crio-adbfc99d65d62aa249f24daea9809a74642c3da954ccb580ada621d4d2f55a0a WatchSource:0}: Error finding container adbfc99d65d62aa249f24daea9809a74642c3da954ccb580ada621d4d2f55a0a: Status 404 returned error can't find the container with id adbfc99d65d62aa249f24daea9809a74642c3da954ccb580ada621d4d2f55a0a Dec 05 20:26:52 crc kubenswrapper[4885]: I1205 20:26:52.488550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 20:26:53 crc kubenswrapper[4885]: I1205 20:26:53.184175 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495ff886-38af-4072-b162-8dc68cb0a0ec" path="/var/lib/kubelet/pods/495ff886-38af-4072-b162-8dc68cb0a0ec/volumes" Dec 05 20:26:53 crc kubenswrapper[4885]: I1205 20:26:53.341130 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5627a8a-d602-4c23-bb2f-e07f9c2a8681","Type":"ContainerStarted","Data":"b5a7ffb1fdfbf4542efc57306bb8d26948ff0c4e833300ea72e009e112950e63"} Dec 05 20:26:53 crc kubenswrapper[4885]: I1205 20:26:53.341171 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5627a8a-d602-4c23-bb2f-e07f9c2a8681","Type":"ContainerStarted","Data":"adbfc99d65d62aa249f24daea9809a74642c3da954ccb580ada621d4d2f55a0a"} Dec 05 20:26:53 crc kubenswrapper[4885]: I1205 20:26:53.581267 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 20:26:53 crc kubenswrapper[4885]: I1205 20:26:53.616213 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.616193876 podStartE2EDuration="2.616193876s" podCreationTimestamp="2025-12-05 20:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:53.370277976 +0000 UTC m=+1278.667093637" watchObservedRunningTime="2025-12-05 20:26:53.616193876 +0000 UTC m=+1278.913009537" Dec 05 20:26:54 crc kubenswrapper[4885]: I1205 20:26:54.497013 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:26:54 crc kubenswrapper[4885]: I1205 20:26:54.497661 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:26:54 crc kubenswrapper[4885]: I1205 20:26:54.498359 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:26:54 crc kubenswrapper[4885]: I1205 20:26:54.500643 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.360650 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.364365 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.534161 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.535906 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.550938 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.698301 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.698379 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.699376 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.699569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.699655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbwt\" (UniqueName: \"kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.699705 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801402 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801475 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbwt\" (UniqueName: \"kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.801609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.802637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.804294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.804317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.804709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.804726 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.857795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbwt\" (UniqueName: \"kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt\") pod \"dnsmasq-dns-5c8fb5597c-8bfq9\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:55 crc kubenswrapper[4885]: I1205 20:26:55.876003 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:56 crc kubenswrapper[4885]: I1205 20:26:56.392990 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:26:56 crc kubenswrapper[4885]: W1205 20:26:56.400252 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2a5f17e_ef46_4471_b9bc_26133ef3760c.slice/crio-dc14dfac320afbe3b7a0aac31d17ee472bd9ca6a71a642ebe2205adb8bb70794 WatchSource:0}: Error finding container dc14dfac320afbe3b7a0aac31d17ee472bd9ca6a71a642ebe2205adb8bb70794: Status 404 returned error can't find the container with id dc14dfac320afbe3b7a0aac31d17ee472bd9ca6a71a642ebe2205adb8bb70794 Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.009874 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.376074 4885 generic.go:334] "Generic (PLEG): container finished" podID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerID="70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc" exitCode=0 Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.376151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" event={"ID":"d2a5f17e-ef46-4471-b9bc-26133ef3760c","Type":"ContainerDied","Data":"70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc"} Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.376175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" event={"ID":"d2a5f17e-ef46-4471-b9bc-26133ef3760c","Type":"ContainerStarted","Data":"dc14dfac320afbe3b7a0aac31d17ee472bd9ca6a71a642ebe2205adb8bb70794"} Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.444379 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.444647 4885 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-central-agent" containerID="cri-o://6d79bb4c47a9cfe5885bb13b5e4b860cf255d87e77d729673b45b16dfe054862" gracePeriod=30 Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.445159 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="proxy-httpd" containerID="cri-o://3b5088b7655762e894c5483c300c45f9247d9c330fa09bc966035c2930b6e0ee" gracePeriod=30 Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.445223 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="sg-core" containerID="cri-o://5f0393a95995f9e28baffd6fdf0f1ab0081bd14e7ef33553c73ece93657bbb83" gracePeriod=30 Dec 05 20:26:57 crc kubenswrapper[4885]: I1205 20:26:57.445270 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-notification-agent" containerID="cri-o://9a13385ba5146482e1ce3d79a09d837232e20e1d7622901bf8f6d54d0d03433f" gracePeriod=30 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.099958 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.386718 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerID="3b5088b7655762e894c5483c300c45f9247d9c330fa09bc966035c2930b6e0ee" exitCode=0 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.386997 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerID="5f0393a95995f9e28baffd6fdf0f1ab0081bd14e7ef33553c73ece93657bbb83" exitCode=2 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.386774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerDied","Data":"3b5088b7655762e894c5483c300c45f9247d9c330fa09bc966035c2930b6e0ee"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387015 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerID="9a13385ba5146482e1ce3d79a09d837232e20e1d7622901bf8f6d54d0d03433f" exitCode=0 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387105 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerID="6d79bb4c47a9cfe5885bb13b5e4b860cf255d87e77d729673b45b16dfe054862" exitCode=0 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerDied","Data":"5f0393a95995f9e28baffd6fdf0f1ab0081bd14e7ef33553c73ece93657bbb83"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerDied","Data":"9a13385ba5146482e1ce3d79a09d837232e20e1d7622901bf8f6d54d0d03433f"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387194 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerDied","Data":"6d79bb4c47a9cfe5885bb13b5e4b860cf255d87e77d729673b45b16dfe054862"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d7ae736-7995-4f53-9e77-48da9146fda8","Type":"ContainerDied","Data":"0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.387223 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de169345f0fdd6609ddb1837965e5814d9f52c79e80dfa86c1145ac4223d166" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.388900 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.389172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" event={"ID":"d2a5f17e-ef46-4471-b9bc-26133ef3760c","Type":"ContainerStarted","Data":"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7"} Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.389379 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-log" containerID="cri-o://7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87" gracePeriod=30 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.389422 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-api" containerID="cri-o://334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856" gracePeriod=30 Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.451294 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" podStartSLOduration=3.451276439 podStartE2EDuration="3.451276439s" podCreationTimestamp="2025-12-05 20:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:58.450353811 +0000 UTC m=+1283.747169482" watchObservedRunningTime="2025-12-05 20:26:58.451276439 +0000 UTC m=+1283.748092100" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.554828 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.554881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ghcr\" (UniqueName: \"kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.554921 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555039 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555132 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555287 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs\") pod \"3d7ae736-7995-4f53-9e77-48da9146fda8\" (UID: \"3d7ae736-7995-4f53-9e77-48da9146fda8\") " Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.555831 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.556355 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.561540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr" (OuterVolumeSpecName: "kube-api-access-2ghcr") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "kube-api-access-2ghcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.563346 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts" (OuterVolumeSpecName: "scripts") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.586395 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.609122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.646251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657621 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ghcr\" (UniqueName: \"kubernetes.io/projected/3d7ae736-7995-4f53-9e77-48da9146fda8-kube-api-access-2ghcr\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657654 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657664 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657672 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d7ae736-7995-4f53-9e77-48da9146fda8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657680 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.657689 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.663662 4885 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data" (OuterVolumeSpecName: "config-data") pod "3d7ae736-7995-4f53-9e77-48da9146fda8" (UID: "3d7ae736-7995-4f53-9e77-48da9146fda8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:26:58 crc kubenswrapper[4885]: I1205 20:26:58.758962 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7ae736-7995-4f53-9e77-48da9146fda8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.406626 4885 generic.go:334] "Generic (PLEG): container finished" podID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerID="7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87" exitCode=143 Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.407000 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.406770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerDied","Data":"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87"} Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.408367 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.430289 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.436716 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.458605 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:59 crc kubenswrapper[4885]: E1205 20:26:59.459037 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="sg-core" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459057 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="sg-core" Dec 05 20:26:59 crc kubenswrapper[4885]: E1205 20:26:59.459104 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-notification-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459112 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-notification-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: E1205 20:26:59.459122 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-central-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459130 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-central-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: E1205 20:26:59.459143 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="proxy-httpd" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459151 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="proxy-httpd" Dec 05 20:26:59 crc 
kubenswrapper[4885]: I1205 20:26:59.459371 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-central-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459390 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="ceilometer-notification-agent" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459402 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="sg-core" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.459417 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" containerName="proxy-httpd" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.465294 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.483002 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.483314 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.483455 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.485922 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.486006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76bkm\" (UniqueName: \"kubernetes.io/projected/0a72398e-830b-402b-83c9-4ea93aa05c76-kube-api-access-76bkm\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.486082 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-log-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.486378 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-config-data\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.486616 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.486811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-run-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.487147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.487204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-scripts\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.494505 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.588958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-config-data\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-run-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589163 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-scripts\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589264 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76bkm\" (UniqueName: \"kubernetes.io/projected/0a72398e-830b-402b-83c9-4ea93aa05c76-kube-api-access-76bkm\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " 
pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.589294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-log-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.591272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-run-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.591592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a72398e-830b-402b-83c9-4ea93aa05c76-log-httpd\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.595084 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-scripts\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.595695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.595827 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-config-data\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.599597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.600859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a72398e-830b-402b-83c9-4ea93aa05c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.607996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76bkm\" (UniqueName: \"kubernetes.io/projected/0a72398e-830b-402b-83c9-4ea93aa05c76-kube-api-access-76bkm\") pod \"ceilometer-0\" (UID: \"0a72398e-830b-402b-83c9-4ea93aa05c76\") " pod="openstack/ceilometer-0" Dec 05 20:26:59 crc kubenswrapper[4885]: I1205 20:26:59.820983 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 20:27:00 crc kubenswrapper[4885]: W1205 20:27:00.264992 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a72398e_830b_402b_83c9_4ea93aa05c76.slice/crio-f8c5619ddd79442553d9d18279c7f29a4a88abdd758b0d3e1770f40b409625eb WatchSource:0}: Error finding container f8c5619ddd79442553d9d18279c7f29a4a88abdd758b0d3e1770f40b409625eb: Status 404 returned error can't find the container with id f8c5619ddd79442553d9d18279c7f29a4a88abdd758b0d3e1770f40b409625eb Dec 05 20:27:00 crc kubenswrapper[4885]: I1205 20:27:00.266900 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 20:27:00 crc kubenswrapper[4885]: I1205 20:27:00.419092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a72398e-830b-402b-83c9-4ea93aa05c76","Type":"ContainerStarted","Data":"f8c5619ddd79442553d9d18279c7f29a4a88abdd758b0d3e1770f40b409625eb"} Dec 05 20:27:01 crc kubenswrapper[4885]: I1205 20:27:01.189256 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7ae736-7995-4f53-9e77-48da9146fda8" path="/var/lib/kubelet/pods/3d7ae736-7995-4f53-9e77-48da9146fda8/volumes" Dec 05 20:27:01 crc kubenswrapper[4885]: I1205 20:27:01.443744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a72398e-830b-402b-83c9-4ea93aa05c76","Type":"ContainerStarted","Data":"90712be07560cb5ab38c90c0d8a346dc173638128e4c6ce4740cd72de9681449"} Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.010328 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.030447 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.035337 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.147234 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data\") pod \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.147363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhxb\" (UniqueName: \"kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb\") pod \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.147422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle\") pod \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.147512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs\") pod \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\" (UID: \"aeb8fa29-9ae0-4760-ac67-8e0fbe016294\") " Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.148431 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs" (OuterVolumeSpecName: "logs") pod "aeb8fa29-9ae0-4760-ac67-8e0fbe016294" (UID: "aeb8fa29-9ae0-4760-ac67-8e0fbe016294"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.152564 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb" (OuterVolumeSpecName: "kube-api-access-rqhxb") pod "aeb8fa29-9ae0-4760-ac67-8e0fbe016294" (UID: "aeb8fa29-9ae0-4760-ac67-8e0fbe016294"). InnerVolumeSpecName "kube-api-access-rqhxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.179751 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb8fa29-9ae0-4760-ac67-8e0fbe016294" (UID: "aeb8fa29-9ae0-4760-ac67-8e0fbe016294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.188455 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data" (OuterVolumeSpecName: "config-data") pod "aeb8fa29-9ae0-4760-ac67-8e0fbe016294" (UID: "aeb8fa29-9ae0-4760-ac67-8e0fbe016294"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.249930 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhxb\" (UniqueName: \"kubernetes.io/projected/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-kube-api-access-rqhxb\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.249972 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.249984 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.249993 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb8fa29-9ae0-4760-ac67-8e0fbe016294-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.458728 4885 generic.go:334] "Generic (PLEG): container finished" podID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerID="334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856" exitCode=0 Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.458764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerDied","Data":"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856"} Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.458791 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.458817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeb8fa29-9ae0-4760-ac67-8e0fbe016294","Type":"ContainerDied","Data":"a1d713781c909135a25e7ba43029f43bce287c264b64430349dffbc96bde02ea"} Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.458837 4885 scope.go:117] "RemoveContainer" containerID="334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.464226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a72398e-830b-402b-83c9-4ea93aa05c76","Type":"ContainerStarted","Data":"2e3cf6f7a938b08a40dd2c730582950bff9110834d745fc1c65b9b835e5a208d"} Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.464283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a72398e-830b-402b-83c9-4ea93aa05c76","Type":"ContainerStarted","Data":"8991fbc67c4530cab7cad4e0292a84c922e521ca8cc8f70172a2d39f5e0d8234"} Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.484036 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.490061 4885 scope.go:117] "RemoveContainer" containerID="7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.508895 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.534413 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.549736 4885 scope.go:117] "RemoveContainer" containerID="334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856" Dec 05 20:27:02 crc kubenswrapper[4885]: E1205 20:27:02.550406 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856\": container with ID starting with 334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856 not found: ID does not exist" containerID="334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.550453 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856"} err="failed to get container status \"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856\": rpc error: code = NotFound desc = could not find container \"334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856\": container with ID starting with 334a467ce2169f64c8ba145459e94f01e5bb7ae530287742a7e6b582db75f856 not found: ID does not exist" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.550479 4885 scope.go:117] "RemoveContainer" containerID="7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87" Dec 05 20:27:02 crc kubenswrapper[4885]: E1205 20:27:02.550999 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87\": container with ID starting with 7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87 not found: 
ID does not exist" containerID="7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.551100 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87"} err="failed to get container status \"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87\": rpc error: code = NotFound desc = could not find container \"7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87\": container with ID starting with 7e0a0eb83667b5d9088bed550fd25b4607613a37a847116ab14a52881eabba87 not found: ID does not exist" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.552966 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:02 crc kubenswrapper[4885]: E1205 20:27:02.553663 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-api" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.553688 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-api" Dec 05 20:27:02 crc kubenswrapper[4885]: E1205 20:27:02.553737 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-log" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.553745 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-log" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.553995 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-log" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.554028 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" containerName="nova-api-api" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.555685 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.558870 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.559144 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.559373 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.594149 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659602 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659668 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659741 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.659880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8cpf\" (UniqueName: \"kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs\") pod \"nova-api-0\" (UID: 
\"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761905 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761930 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.761962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8cpf\" (UniqueName: \"kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.763570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.770270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.771122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.775363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.775916 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.783502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8cpf\" (UniqueName: \"kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf\") pod \"nova-api-0\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " 
pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.801766 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-lg8pc"] Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.803809 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.805737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.805985 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.813295 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lg8pc"] Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.863723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhc4\" (UniqueName: \"kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.863863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.863919 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.863986 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.904994 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.966228 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.966313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.966379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.966455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txhc4\" (UniqueName: \"kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.971041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.971601 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.971918 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:02 crc kubenswrapper[4885]: I1205 20:27:02.986511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhc4\" (UniqueName: \"kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4\") pod \"nova-cell1-cell-mapping-lg8pc\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:03 crc kubenswrapper[4885]: I1205 20:27:03.127575 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:03 crc kubenswrapper[4885]: I1205 20:27:03.215654 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb8fa29-9ae0-4760-ac67-8e0fbe016294" path="/var/lib/kubelet/pods/aeb8fa29-9ae0-4760-ac67-8e0fbe016294/volumes" Dec 05 20:27:03 crc kubenswrapper[4885]: I1205 20:27:03.408750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:03 crc kubenswrapper[4885]: W1205 20:27:03.410341 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode83a4721_02e6_4b61_a6bd_bfab8a756e2d.slice/crio-1c5367f93ea4123fae3ce9a8bb56016127a6849d73cb308ecb17ea80fadd50db WatchSource:0}: Error finding container 1c5367f93ea4123fae3ce9a8bb56016127a6849d73cb308ecb17ea80fadd50db: Status 404 returned error can't find the container with id 1c5367f93ea4123fae3ce9a8bb56016127a6849d73cb308ecb17ea80fadd50db Dec 05 20:27:03 crc kubenswrapper[4885]: I1205 20:27:03.481488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerStarted","Data":"1c5367f93ea4123fae3ce9a8bb56016127a6849d73cb308ecb17ea80fadd50db"} Dec 05 20:27:03 crc kubenswrapper[4885]: I1205 20:27:03.621503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lg8pc"] Dec 05 20:27:03 crc kubenswrapper[4885]: W1205 20:27:03.636190 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a7f1297_73c8_4b59_99c9_386d4b5483a1.slice/crio-e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85 WatchSource:0}: Error finding container e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85: Status 404 returned error can't find the container with id e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85 Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.490771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lg8pc" event={"ID":"7a7f1297-73c8-4b59-99c9-386d4b5483a1","Type":"ContainerStarted","Data":"7ffecd4c7a6cf04b6c732f34f34da32565af45ee5c40465741a1ce8297e28b53"} Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.491145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lg8pc" event={"ID":"7a7f1297-73c8-4b59-99c9-386d4b5483a1","Type":"ContainerStarted","Data":"e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85"} Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.495625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerStarted","Data":"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"} Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.495660 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerStarted","Data":"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"} Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.500993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a72398e-830b-402b-83c9-4ea93aa05c76","Type":"ContainerStarted","Data":"a33ce0040a6d00e768162293bef6c18b01b79ed92f7890d6fb77e0c3ca4e184a"} Dec 05 20:27:04 
crc kubenswrapper[4885]: I1205 20:27:04.501760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.789771 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-lg8pc" podStartSLOduration=2.78975251 podStartE2EDuration="2.78975251s" podCreationTimestamp="2025-12-05 20:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:04.774699657 +0000 UTC m=+1290.071515318" watchObservedRunningTime="2025-12-05 20:27:04.78975251 +0000 UTC m=+1290.086568171" Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.813570 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.641131714 podStartE2EDuration="5.813550592s" podCreationTimestamp="2025-12-05 20:26:59 +0000 UTC" firstStartedPulling="2025-12-05 20:27:00.268150244 +0000 UTC m=+1285.564965905" lastFinishedPulling="2025-12-05 20:27:03.440569122 +0000 UTC m=+1288.737384783" observedRunningTime="2025-12-05 20:27:04.800716617 +0000 UTC m=+1290.097532308" watchObservedRunningTime="2025-12-05 20:27:04.813550592 +0000 UTC m=+1290.110366263" Dec 05 20:27:04 crc kubenswrapper[4885]: I1205 20:27:04.826908 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.826889181 podStartE2EDuration="2.826889181s" podCreationTimestamp="2025-12-05 20:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:04.82296027 +0000 UTC m=+1290.119775941" watchObservedRunningTime="2025-12-05 20:27:04.826889181 +0000 UTC m=+1290.123704842" Dec 05 20:27:05 crc kubenswrapper[4885]: I1205 20:27:05.877906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:27:05 crc kubenswrapper[4885]: I1205 20:27:05.984276 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"] Dec 05 20:27:05 crc kubenswrapper[4885]: I1205 20:27:05.984547 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="dnsmasq-dns" containerID="cri-o://b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735" gracePeriod=10 Dec 05 20:27:06 crc kubenswrapper[4885]: E1205 20:27:06.237851 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a873296_2fb6_42f4_b88b_30a8292bc14e.slice/crio-b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.531431 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerID="b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735" exitCode=0 Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.531494 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.531507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" event={"ID":"0a873296-2fb6-42f4-b88b-30a8292bc14e","Type":"ContainerDied","Data":"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735"} Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.531825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" event={"ID":"0a873296-2fb6-42f4-b88b-30a8292bc14e","Type":"ContainerDied","Data":"26d33f997234f1dc529791db5dff3523fe6a8d946a62c3c417163b6e5d76e6a4"} Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.531848 4885 scope.go:117] "RemoveContainer" containerID="b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.552640 4885 scope.go:117] "RemoveContainer" containerID="7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.602285 4885 scope.go:117] "RemoveContainer" containerID="b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735" Dec 05 20:27:06 crc kubenswrapper[4885]: E1205 20:27:06.602805 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735\": container with ID starting with b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735 not found: ID does not exist" containerID="b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.602837 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735"} err="failed to get container status \"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735\": rpc error: code = NotFound desc = could not find container \"b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735\": container with ID starting with b1fea74976e6df29d90d0e3f5be78c77a3cb060bb0d5fa21d448360903644735 not found: ID does not exist" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.602857 4885 scope.go:117] "RemoveContainer" containerID="7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346" Dec 05 20:27:06 crc kubenswrapper[4885]: E1205 20:27:06.603260 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346\": container with ID starting with 7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346 not found: ID does not exist" containerID="7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.603297 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346"} err="failed to get container status \"7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346\": rpc error: code = NotFound desc = could not find container \"7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346\": container with ID starting with 7ab16f8f140013f50d130ab6e5099dceb8c6acd299e4208c1a8f177aa359d346 not found: ID does not exist" Dec 05 20:27:06 crc 
kubenswrapper[4885]: I1205 20:27:06.649865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.649901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.649919 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.649954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.650078 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.650148 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb5p6\" (UniqueName: \"kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6\") pod \"0a873296-2fb6-42f4-b88b-30a8292bc14e\" (UID: \"0a873296-2fb6-42f4-b88b-30a8292bc14e\") " Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.656199 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6" (OuterVolumeSpecName: "kube-api-access-sb5p6") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "kube-api-access-sb5p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.703078 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config" (OuterVolumeSpecName: "config") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.706337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.717094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.726226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.731877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a873296-2fb6-42f4-b88b-30a8292bc14e" (UID: "0a873296-2fb6-42f4-b88b-30a8292bc14e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753103 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753139 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753151 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb5p6\" (UniqueName: \"kubernetes.io/projected/0a873296-2fb6-42f4-b88b-30a8292bc14e-kube-api-access-sb5p6\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753162 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753170 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:06 crc kubenswrapper[4885]: I1205 20:27:06.753178 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a873296-2fb6-42f4-b88b-30a8292bc14e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:07 crc kubenswrapper[4885]: I1205 20:27:07.540869 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9ff45c7-dlf42" Dec 05 20:27:07 crc kubenswrapper[4885]: I1205 20:27:07.563113 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"] Dec 05 20:27:07 crc kubenswrapper[4885]: I1205 20:27:07.572866 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9ff45c7-dlf42"] Dec 05 20:27:09 crc kubenswrapper[4885]: I1205 20:27:09.185235 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" path="/var/lib/kubelet/pods/0a873296-2fb6-42f4-b88b-30a8292bc14e/volumes" Dec 05 20:27:09 crc kubenswrapper[4885]: I1205 20:27:09.584073 4885 generic.go:334] "Generic (PLEG): container finished" podID="7a7f1297-73c8-4b59-99c9-386d4b5483a1" containerID="7ffecd4c7a6cf04b6c732f34f34da32565af45ee5c40465741a1ce8297e28b53" exitCode=0 Dec 05 20:27:09 crc kubenswrapper[4885]: I1205 20:27:09.584190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lg8pc" event={"ID":"7a7f1297-73c8-4b59-99c9-386d4b5483a1","Type":"ContainerDied","Data":"7ffecd4c7a6cf04b6c732f34f34da32565af45ee5c40465741a1ce8297e28b53"} Dec 05 20:27:10 crc kubenswrapper[4885]: I1205 20:27:10.988143 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.035852 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txhc4\" (UniqueName: \"kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4\") pod \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.036072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data\") pod \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.036125 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts\") pod \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.036173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle\") pod \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\" (UID: \"7a7f1297-73c8-4b59-99c9-386d4b5483a1\") " Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.040976 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4" (OuterVolumeSpecName: "kube-api-access-txhc4") pod "7a7f1297-73c8-4b59-99c9-386d4b5483a1" (UID: "7a7f1297-73c8-4b59-99c9-386d4b5483a1"). InnerVolumeSpecName "kube-api-access-txhc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.041891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts" (OuterVolumeSpecName: "scripts") pod "7a7f1297-73c8-4b59-99c9-386d4b5483a1" (UID: "7a7f1297-73c8-4b59-99c9-386d4b5483a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.063736 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a7f1297-73c8-4b59-99c9-386d4b5483a1" (UID: "7a7f1297-73c8-4b59-99c9-386d4b5483a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.064168 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data" (OuterVolumeSpecName: "config-data") pod "7a7f1297-73c8-4b59-99c9-386d4b5483a1" (UID: "7a7f1297-73c8-4b59-99c9-386d4b5483a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.139153 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.139293 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.139316 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7f1297-73c8-4b59-99c9-386d4b5483a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.139330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txhc4\" (UniqueName: \"kubernetes.io/projected/7a7f1297-73c8-4b59-99c9-386d4b5483a1-kube-api-access-txhc4\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.616983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lg8pc" event={"ID":"7a7f1297-73c8-4b59-99c9-386d4b5483a1","Type":"ContainerDied","Data":"e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85"} Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.617486 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e16ec4c9a505b8824b5284974cdb93283ed50f2c29ae7654b8b1af7b3d968d85" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.617153 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lg8pc" Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.781033 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.781247 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-log" containerID="cri-o://ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64" gracePeriod=30 Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.781318 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-api" containerID="cri-o://2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764" gracePeriod=30 Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.843526 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.843738 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerName="nova-scheduler-scheduler" containerID="cri-o://4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" gracePeriod=30 Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.865861 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.866559 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log" containerID="cri-o://033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319" gracePeriod=30 Dec 05 20:27:11 crc kubenswrapper[4885]: I1205 20:27:11.866646 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata" containerID="cri-o://fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4" gracePeriod=30 Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.261179 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361610 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361661 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8cpf\" (UniqueName: \"kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361774 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.361868 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs\") pod \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\" (UID: \"e83a4721-02e6-4b61-a6bd-bfab8a756e2d\") " Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.362608 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs" (OuterVolumeSpecName: "logs") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.366860 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf" (OuterVolumeSpecName: "kube-api-access-j8cpf") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "kube-api-access-j8cpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.392734 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data" (OuterVolumeSpecName: "config-data") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.395298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.410228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.415252 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e83a4721-02e6-4b61-a6bd-bfab8a756e2d" (UID: "e83a4721-02e6-4b61-a6bd-bfab8a756e2d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464004 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464062 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8cpf\" (UniqueName: \"kubernetes.io/projected/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-kube-api-access-j8cpf\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464074 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464083 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464092 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.464101 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e83a4721-02e6-4b61-a6bd-bfab8a756e2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.627197 4885 generic.go:334] "Generic (PLEG): container finished" podID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerID="033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319" exitCode=143 Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.627265 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerDied","Data":"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"} 
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631008 4885 generic.go:334] "Generic (PLEG): container finished" podID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerID="2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764" exitCode=0
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631070 4885 generic.go:334] "Generic (PLEG): container finished" podID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerID="ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64" exitCode=143
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerDied","Data":"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"}
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerDied","Data":"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"}
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e83a4721-02e6-4b61-a6bd-bfab8a756e2d","Type":"ContainerDied","Data":"1c5367f93ea4123fae3ce9a8bb56016127a6849d73cb308ecb17ea80fadd50db"}
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631141 4885 scope.go:117] "RemoveContainer" containerID="2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.631290 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.661947 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.665204 4885 scope.go:117] "RemoveContainer" containerID="ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.682598 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695080 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.695540 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-log"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695562 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-log"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.695586 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="init"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="init"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.695619 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="dnsmasq-dns"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695626 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="dnsmasq-dns"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.695643 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-api"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695648 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-api"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.695675 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7f1297-73c8-4b59-99c9-386d4b5483a1" containerName="nova-manage"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695681 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7f1297-73c8-4b59-99c9-386d4b5483a1" containerName="nova-manage"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695859 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7f1297-73c8-4b59-99c9-386d4b5483a1" containerName="nova-manage"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695878 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-api"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695892 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a873296-2fb6-42f4-b88b-30a8292bc14e" containerName="dnsmasq-dns"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.695905 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" containerName="nova-api-log"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.696995 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.697738 4885 scope.go:117] "RemoveContainer" containerID="2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.698392 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764\": container with ID starting with 2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764 not found: ID does not exist" containerID="2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.698434 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"} err="failed to get container status \"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764\": rpc error: code = NotFound desc = could not find container \"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764\": container with ID starting with 2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764 not found: ID does not exist"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.698461 4885 scope.go:117] "RemoveContainer" containerID="ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"
Dec 05 20:27:12 crc kubenswrapper[4885]: E1205 20:27:12.699138 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64\": container with ID starting with ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64 not found: ID does not exist" containerID="ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.699209 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"} err="failed to get container status \"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64\": rpc error: code = NotFound desc = could not find container \"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64\": container with ID starting with ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64 not found: ID does not exist"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.699241 4885 scope.go:117] "RemoveContainer" containerID="2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.699242 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.700703 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.701430 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.703190 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764"} err="failed to get container status \"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764\": rpc error: code = NotFound desc = could not find container \"2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764\": container with ID starting with 2ba042d94a9679b880d2a30ee8350d9bc939856e3c25661aad3ddfa860a17764 not found: ID does not exist"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.703261 4885 scope.go:117] "RemoveContainer" containerID="ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.703709 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64"} err="failed to get container status \"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64\": rpc error: code = NotFound desc = could not find container \"ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64\": container with ID starting with ec2eb7d65e6b330988e72b71bd1ef8ef663ae4447ebf0f29156ca45b136cfa64 not found: ID does not exist"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.730599 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.772440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.772829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.773140 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.773327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-config-data\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.773523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdscc\" (UniqueName: \"kubernetes.io/projected/1e275487-025f-4c31-a7f4-267b05218da9-kube-api-access-jdscc\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.773674 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e275487-025f-4c31-a7f4-267b05218da9-logs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.875838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.876233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.876350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-config-data\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.876489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdscc\" (UniqueName: \"kubernetes.io/projected/1e275487-025f-4c31-a7f4-267b05218da9-kube-api-access-jdscc\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.876611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e275487-025f-4c31-a7f4-267b05218da9-logs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.876795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.877187 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e275487-025f-4c31-a7f4-267b05218da9-logs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.881116 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.881745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.882601 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-config-data\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.883499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e275487-025f-4c31-a7f4-267b05218da9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:12 crc kubenswrapper[4885]: I1205 20:27:12.893741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdscc\" (UniqueName: \"kubernetes.io/projected/1e275487-025f-4c31-a7f4-267b05218da9-kube-api-access-jdscc\") pod \"nova-api-0\" (UID: \"1e275487-025f-4c31-a7f4-267b05218da9\") " pod="openstack/nova-api-0"
Dec 05 20:27:13 crc kubenswrapper[4885]: I1205 20:27:13.021250 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 20:27:13 crc kubenswrapper[4885]: I1205 20:27:13.188197 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83a4721-02e6-4b61-a6bd-bfab8a756e2d" path="/var/lib/kubelet/pods/e83a4721-02e6-4b61-a6bd-bfab8a756e2d/volumes"
Dec 05 20:27:13 crc kubenswrapper[4885]: I1205 20:27:13.447501 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 20:27:13 crc kubenswrapper[4885]: W1205 20:27:13.456551 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e275487_025f_4c31_a7f4_267b05218da9.slice/crio-5cd665e79972de8ab3d0d0e10dd99b1a9ec9a2d8665aa200c08cb15b6c63ccee WatchSource:0}: Error finding container 5cd665e79972de8ab3d0d0e10dd99b1a9ec9a2d8665aa200c08cb15b6c63ccee: Status 404 returned error can't find the container with id 5cd665e79972de8ab3d0d0e10dd99b1a9ec9a2d8665aa200c08cb15b6c63ccee
Dec 05 20:27:13 crc kubenswrapper[4885]: E1205 20:27:13.460933 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:27:13 crc kubenswrapper[4885]: E1205 20:27:13.463108 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:27:13 crc kubenswrapper[4885]: E1205 20:27:13.466784 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 20:27:13 crc kubenswrapper[4885]: E1205 20:27:13.466828 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerName="nova-scheduler-scheduler"
Dec 05 20:27:13 crc kubenswrapper[4885]: I1205 20:27:13.646073 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e275487-025f-4c31-a7f4-267b05218da9","Type":"ContainerStarted","Data":"5cd665e79972de8ab3d0d0e10dd99b1a9ec9a2d8665aa200c08cb15b6c63ccee"}
Dec 05 20:27:14 crc kubenswrapper[4885]: I1205 20:27:14.663095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e275487-025f-4c31-a7f4-267b05218da9","Type":"ContainerStarted","Data":"d3b4aaffea05351f99329ef5bb7287a8ded6e730a458d3689a92a5393c21eb80"}
Dec 05 20:27:14 crc kubenswrapper[4885]: I1205 20:27:14.666016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e275487-025f-4c31-a7f4-267b05218da9","Type":"ContainerStarted","Data":"0edf5e3e693bfbb4edadbb66575bd7f8b19aee8b6deca3eda1303242543e0de9"}
Dec 05 20:27:14 crc kubenswrapper[4885]: I1205 20:27:14.691120 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.691101978 podStartE2EDuration="2.691101978s" podCreationTimestamp="2025-12-05 20:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:14.683959056 +0000 UTC m=+1299.980774757" watchObservedRunningTime="2025-12-05 20:27:14.691101978 +0000 UTC m=+1299.987917639"
Dec 05 20:27:14 crc kubenswrapper[4885]: I1205 20:27:14.941306 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:41050->10.217.0.191:8775: read: connection reset by peer"
Dec 05 20:27:14 crc kubenswrapper[4885]: I1205 20:27:14.941323 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:41044->10.217.0.191:8775: read: connection reset by peer"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.325045 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.424224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs\") pod \"20be5ba9-3fcb-446d-bec3-eaf96556d805\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") "
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.424300 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs\") pod \"20be5ba9-3fcb-446d-bec3-eaf96556d805\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") "
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.424402 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle\") pod \"20be5ba9-3fcb-446d-bec3-eaf96556d805\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") "
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.424504 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbgjc\" (UniqueName: \"kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc\") pod \"20be5ba9-3fcb-446d-bec3-eaf96556d805\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") "
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.424550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data\") pod \"20be5ba9-3fcb-446d-bec3-eaf96556d805\" (UID: \"20be5ba9-3fcb-446d-bec3-eaf96556d805\") "
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.425483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs" (OuterVolumeSpecName: "logs") pod "20be5ba9-3fcb-446d-bec3-eaf96556d805" (UID: "20be5ba9-3fcb-446d-bec3-eaf96556d805"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.433886 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc" (OuterVolumeSpecName: "kube-api-access-xbgjc") pod "20be5ba9-3fcb-446d-bec3-eaf96556d805" (UID: "20be5ba9-3fcb-446d-bec3-eaf96556d805"). InnerVolumeSpecName "kube-api-access-xbgjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.490531 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20be5ba9-3fcb-446d-bec3-eaf96556d805" (UID: "20be5ba9-3fcb-446d-bec3-eaf96556d805"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.508492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data" (OuterVolumeSpecName: "config-data") pod "20be5ba9-3fcb-446d-bec3-eaf96556d805" (UID: "20be5ba9-3fcb-446d-bec3-eaf96556d805"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.516932 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "20be5ba9-3fcb-446d-bec3-eaf96556d805" (UID: "20be5ba9-3fcb-446d-bec3-eaf96556d805"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.526481 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20be5ba9-3fcb-446d-bec3-eaf96556d805-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.526513 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.526523 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbgjc\" (UniqueName: \"kubernetes.io/projected/20be5ba9-3fcb-446d-bec3-eaf96556d805-kube-api-access-xbgjc\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.526532 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.526541 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20be5ba9-3fcb-446d-bec3-eaf96556d805-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.675370 4885 generic.go:334] "Generic (PLEG): container finished" podID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerID="fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4" exitCode=0
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.675455 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.675507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerDied","Data":"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"}
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.675552 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20be5ba9-3fcb-446d-bec3-eaf96556d805","Type":"ContainerDied","Data":"68f4e04e2440b14cd97a5d011069faef75d2616b41148c472ae1951794dd2b46"}
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.675581 4885 scope.go:117] "RemoveContainer" containerID="fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.697229 4885 scope.go:117] "RemoveContainer" containerID="033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.717053 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.721272 4885 scope.go:117] "RemoveContainer" containerID="fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"
Dec 05 20:27:15 crc kubenswrapper[4885]: E1205 20:27:15.722459 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4\": container with ID starting with fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4 not found: ID does not exist" containerID="fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.722504 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4"} err="failed to get container status \"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4\": rpc error: code = NotFound desc = could not find container \"fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4\": container with ID starting with fd485a8695d5a55319bbcd91e05f32caed1f175a66d999a7ab4dca5e6d0552f4 not found: ID does not exist"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.722529 4885 scope.go:117] "RemoveContainer" containerID="033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"
Dec 05 20:27:15 crc kubenswrapper[4885]: E1205 20:27:15.722945 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319\": container with ID starting with 033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319 not found: ID does not exist" containerID="033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.722979 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319"} err="failed to get container status \"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319\": rpc error: code = NotFound desc = could not find container \"033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319\": container with ID starting with 033add53f2a15716fe27fb06e12fdbb503d21d3dbe93e1ab0443a11dbe23c319 not found: ID does not exist"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.724882 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.748339 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 20:27:15 crc kubenswrapper[4885]: E1205 20:27:15.748728 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.748748 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata"
Dec 05 20:27:15 crc kubenswrapper[4885]: E1205 20:27:15.748775 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.748781 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.748951 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-log"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.748970 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" containerName="nova-metadata-metadata"
Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.749925 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.751992 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.752884 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.770540 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.831595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-config-data\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.832099 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.832212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvc4s\" (UniqueName: \"kubernetes.io/projected/41069d3f-c9d5-4278-8171-cebf5434937e-kube-api-access-mvc4s\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.832292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41069d3f-c9d5-4278-8171-cebf5434937e-logs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.832417 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.934452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvc4s\" (UniqueName: \"kubernetes.io/projected/41069d3f-c9d5-4278-8171-cebf5434937e-kube-api-access-mvc4s\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.934526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41069d3f-c9d5-4278-8171-cebf5434937e-logs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.934658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " 
pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.934768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-config-data\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.934873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.936130 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41069d3f-c9d5-4278-8171-cebf5434937e-logs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.939656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.939823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-config-data\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.940238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41069d3f-c9d5-4278-8171-cebf5434937e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:15 crc kubenswrapper[4885]: I1205 20:27:15.952689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvc4s\" (UniqueName: \"kubernetes.io/projected/41069d3f-c9d5-4278-8171-cebf5434937e-kube-api-access-mvc4s\") pod \"nova-metadata-0\" (UID: \"41069d3f-c9d5-4278-8171-cebf5434937e\") " pod="openstack/nova-metadata-0" Dec 05 20:27:16 crc kubenswrapper[4885]: I1205 20:27:16.070792 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 20:27:16 crc kubenswrapper[4885]: I1205 20:27:16.548796 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 20:27:16 crc kubenswrapper[4885]: I1205 20:27:16.631395 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:27:16 crc kubenswrapper[4885]: I1205 20:27:16.631463 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:27:16 crc kubenswrapper[4885]: I1205 20:27:16.687946 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41069d3f-c9d5-4278-8171-cebf5434937e","Type":"ContainerStarted","Data":"95fd8842a176f7632a011c729587ffb53e1ade2e2a8bd4bce767021292c307dc"} Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.188253 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20be5ba9-3fcb-446d-bec3-eaf96556d805" path="/var/lib/kubelet/pods/20be5ba9-3fcb-446d-bec3-eaf96556d805/volumes" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.448716 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.564110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data\") pod \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.564191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle\") pod \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.564287 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km2bp\" (UniqueName: \"kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp\") pod \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\" (UID: \"0885bbfa-d44b-4e51-948b-8089bbb49c7b\") " Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.570121 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp" (OuterVolumeSpecName: "kube-api-access-km2bp") pod "0885bbfa-d44b-4e51-948b-8089bbb49c7b" (UID: "0885bbfa-d44b-4e51-948b-8089bbb49c7b"). InnerVolumeSpecName "kube-api-access-km2bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.597998 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0885bbfa-d44b-4e51-948b-8089bbb49c7b" (UID: "0885bbfa-d44b-4e51-948b-8089bbb49c7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.600810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data" (OuterVolumeSpecName: "config-data") pod "0885bbfa-d44b-4e51-948b-8089bbb49c7b" (UID: "0885bbfa-d44b-4e51-948b-8089bbb49c7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.666426 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2bp\" (UniqueName: \"kubernetes.io/projected/0885bbfa-d44b-4e51-948b-8089bbb49c7b-kube-api-access-km2bp\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.666466 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.666478 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0885bbfa-d44b-4e51-948b-8089bbb49c7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.698127 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41069d3f-c9d5-4278-8171-cebf5434937e","Type":"ContainerStarted","Data":"6a528d40c558fc218887429f31242493fe52eb39fea797795a5e3af47f6f2067"} Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.698170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41069d3f-c9d5-4278-8171-cebf5434937e","Type":"ContainerStarted","Data":"3129a3747347fa97ee8e12aa2e0459ca29f10ad42cf53454aae7e53742b19c57"} Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.699848 4885 generic.go:334] "Generic (PLEG): container finished" podID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" exitCode=0 Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.699872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0885bbfa-d44b-4e51-948b-8089bbb49c7b","Type":"ContainerDied","Data":"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26"} Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.699892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0885bbfa-d44b-4e51-948b-8089bbb49c7b","Type":"ContainerDied","Data":"226143449e5b63b9104a8df10645ce988ba9f7dbbbf17814d417f4110ef55942"} Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.699893 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.699919 4885 scope.go:117] "RemoveContainer" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.733629 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.733609603 podStartE2EDuration="2.733609603s" podCreationTimestamp="2025-12-05 20:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:17.723926061 +0000 UTC m=+1303.020741762" watchObservedRunningTime="2025-12-05 20:27:17.733609603 +0000 UTC m=+1303.030425264" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.746502 4885 scope.go:117] "RemoveContainer" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" Dec 05 20:27:17 crc kubenswrapper[4885]: E1205 20:27:17.746899 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26\": container with ID starting with 4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26 not found: ID does not exist" containerID="4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.746943 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26"} err="failed to get container status \"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26\": rpc error: code = NotFound desc = could not find container \"4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26\": container with ID starting with 4ef80d3f7dab56a228bb64695c7c7c618c891820b73c90c741faf5395897ff26 not found: ID does not exist" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.758986 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.791442 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.803277 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:17 crc kubenswrapper[4885]: E1205 20:27:17.803808 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerName="nova-scheduler-scheduler" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.803838 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerName="nova-scheduler-scheduler" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.804109 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" containerName="nova-scheduler-scheduler" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.804803 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.807909 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.812251 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.878337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh49z\" (UniqueName: \"kubernetes.io/projected/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-kube-api-access-wh49z\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.878695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.878875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-config-data\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.980525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh49z\" (UniqueName: \"kubernetes.io/projected/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-kube-api-access-wh49z\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.980631 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.980691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-config-data\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.985610 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.986958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-config-data\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:17 crc kubenswrapper[4885]: I1205 20:27:17.997521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh49z\" (UniqueName: 
\"kubernetes.io/projected/0d49d4cd-955c-41c7-8df0-63b364cb3e2d-kube-api-access-wh49z\") pod \"nova-scheduler-0\" (UID: \"0d49d4cd-955c-41c7-8df0-63b364cb3e2d\") " pod="openstack/nova-scheduler-0" Dec 05 20:27:18 crc kubenswrapper[4885]: I1205 20:27:18.125189 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 20:27:18 crc kubenswrapper[4885]: W1205 20:27:18.439536 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d49d4cd_955c_41c7_8df0_63b364cb3e2d.slice/crio-0fde4f25b75bfe3f0cff81bf2214568b6248bab676fa28ed87209125b0852ea0 WatchSource:0}: Error finding container 0fde4f25b75bfe3f0cff81bf2214568b6248bab676fa28ed87209125b0852ea0: Status 404 returned error can't find the container with id 0fde4f25b75bfe3f0cff81bf2214568b6248bab676fa28ed87209125b0852ea0 Dec 05 20:27:18 crc kubenswrapper[4885]: I1205 20:27:18.440680 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 20:27:18 crc kubenswrapper[4885]: I1205 20:27:18.713508 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d49d4cd-955c-41c7-8df0-63b364cb3e2d","Type":"ContainerStarted","Data":"dc280024866facd9ebbb2f97d2c69568b69f05c5a32b6940da7a7d937f454927"} Dec 05 20:27:18 crc kubenswrapper[4885]: I1205 20:27:18.713860 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d49d4cd-955c-41c7-8df0-63b364cb3e2d","Type":"ContainerStarted","Data":"0fde4f25b75bfe3f0cff81bf2214568b6248bab676fa28ed87209125b0852ea0"} Dec 05 20:27:18 crc kubenswrapper[4885]: I1205 20:27:18.728455 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.728429408 podStartE2EDuration="1.728429408s" podCreationTimestamp="2025-12-05 20:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:27:18.727562122 +0000 UTC m=+1304.024377773" watchObservedRunningTime="2025-12-05 20:27:18.728429408 +0000 UTC m=+1304.025245109" Dec 05 20:27:19 crc kubenswrapper[4885]: I1205 20:27:19.189571 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0885bbfa-d44b-4e51-948b-8089bbb49c7b" path="/var/lib/kubelet/pods/0885bbfa-d44b-4e51-948b-8089bbb49c7b/volumes" Dec 05 20:27:21 crc kubenswrapper[4885]: I1205 20:27:21.070924 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:27:21 crc kubenswrapper[4885]: I1205 20:27:21.071317 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 20:27:23 crc kubenswrapper[4885]: I1205 20:27:23.021645 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:27:23 crc kubenswrapper[4885]: I1205 20:27:23.022266 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 20:27:23 crc kubenswrapper[4885]: I1205 20:27:23.125783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 20:27:24 crc kubenswrapper[4885]: I1205 20:27:24.035206 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e275487-025f-4c31-a7f4-267b05218da9" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:27:24 crc kubenswrapper[4885]: I1205 20:27:24.035206 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e275487-025f-4c31-a7f4-267b05218da9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:27:26 crc kubenswrapper[4885]: I1205 20:27:26.071681 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:27:26 crc kubenswrapper[4885]: I1205 20:27:26.072047 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 20:27:27 crc kubenswrapper[4885]: I1205 20:27:27.084183 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="41069d3f-c9d5-4278-8171-cebf5434937e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:27:27 crc kubenswrapper[4885]: I1205 20:27:27.084183 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="41069d3f-c9d5-4278-8171-cebf5434937e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 20:27:28 crc kubenswrapper[4885]: I1205 20:27:28.126070 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 20:27:28 crc kubenswrapper[4885]: I1205 20:27:28.162716 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 20:27:28 crc kubenswrapper[4885]: I1205 20:27:28.864645 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 20:27:29 crc kubenswrapper[4885]: I1205 20:27:29.829821 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.029006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.029755 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.032042 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.038271 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.871896 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 20:27:33 crc kubenswrapper[4885]: I1205 20:27:33.878070 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 20:27:36 crc kubenswrapper[4885]: I1205 20:27:36.077486 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 20:27:36 crc kubenswrapper[4885]: I1205 20:27:36.078770 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" 
Dec 05 20:27:36 crc kubenswrapper[4885]: I1205 20:27:36.088503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:27:36 crc kubenswrapper[4885]: I1205 20:27:36.937379 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 20:27:45 crc kubenswrapper[4885]: I1205 20:27:45.244186 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:46 crc kubenswrapper[4885]: I1205 20:27:46.086144 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:46 crc kubenswrapper[4885]: I1205 20:27:46.630769 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:27:46 crc kubenswrapper[4885]: I1205 20:27:46.630839 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:27:49 crc kubenswrapper[4885]: I1205 20:27:49.341783 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="rabbitmq" containerID="cri-o://964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1" gracePeriod=604796 Dec 05 20:27:49 crc kubenswrapper[4885]: I1205 20:27:49.996321 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="rabbitmq" containerID="cri-o://f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c" gracePeriod=604797 Dec 05 20:27:53 crc kubenswrapper[4885]: I1205 20:27:53.587308 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 05 20:27:53 crc kubenswrapper[4885]: I1205 20:27:53.877412 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 05 20:27:55 crc kubenswrapper[4885]: I1205 20:27:55.972838 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078424 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078597 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078904 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078943 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.078974 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qsbc\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: 
\"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.079001 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info\") pod \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\" (UID: \"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.081724 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.083390 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.085545 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.098910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc" (OuterVolumeSpecName: "kube-api-access-6qsbc") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "kube-api-access-6qsbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.126487 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.126499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.126499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info" (OuterVolumeSpecName: "pod-info") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.128009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.129770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data" (OuterVolumeSpecName: "config-data") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.143385 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerID="964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1" exitCode=0 Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.143447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerDied","Data":"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1"} Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.143472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee","Type":"ContainerDied","Data":"cd892dc43a46ad8270548b30d8e8d28ac09267d52acf697d8b254d0c18e27f61"} Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.143488 4885 scope.go:117] "RemoveContainer" containerID="964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.143710 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.153777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf" (OuterVolumeSpecName: "server-conf") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182543 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182580 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182608 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182620 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182631 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182641 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182652 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182665 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182675 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qsbc\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-kube-api-access-6qsbc\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.182684 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.212394 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.260216 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" (UID: "2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.265597 4885 scope.go:117] "RemoveContainer" containerID="92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.284330 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.284378 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.293223 4885 scope.go:117] "RemoveContainer" containerID="964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1" Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.293652 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1\": container with ID starting with 964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1 not found: ID does not exist" containerID="964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.293692 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1"} err="failed to get container status \"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1\": rpc error: code = NotFound desc = could not find container \"964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1\": container with ID starting with 964ed81a92ab2e00f935f8233cdd96fe1198bb0f098bef4e1e7daf63fe3fafa1 not found: ID does not exist" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.293719 4885 scope.go:117] "RemoveContainer" containerID="92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe" Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.294118 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe\": container with ID starting with 92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe not found: ID does not exist" containerID="92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.294149 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe"} err="failed to get container status \"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe\": rpc error: code = NotFound desc = could not find container \"92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe\": container with ID starting with 92854c07f11776cd9ac2f61cab36a7565089c9ec58d08137d9efcb0447965bfe not found: ID does not exist" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.490912 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.501838 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.508205 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.524464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.524927 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="setup-container" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.524951 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="setup-container" Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.524966 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.524974 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.524994 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="setup-container" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.525001 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="setup-container" Dec 05 20:27:56 crc kubenswrapper[4885]: E1205 20:27:56.525050 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.525064 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.525284 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.525301 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerName="rabbitmq" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.526534 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530363 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530608 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530639 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530688 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-w8rlv" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530753 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.530795 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.555299 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599311 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599377 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fczd7\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599439 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599492 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599511 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 
20:27:56.599578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599661 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599689 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.599747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins\") pod \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\" (UID: \"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68\") " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600229 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600281 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600405 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdc87c63-a124-485c-8f34-016d17a58f29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf5jl\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-kube-api-access-lf5jl\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600453 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdc87c63-a124-485c-8f34-016d17a58f29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.600467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.601225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.601593 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.602678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.608765 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7" (OuterVolumeSpecName: "kube-api-access-fczd7") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "kube-api-access-fczd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.609930 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.611130 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.619458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info" (OuterVolumeSpecName: "pod-info") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.621462 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.636923 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data" (OuterVolumeSpecName: "config-data") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.675677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf" (OuterVolumeSpecName: "server-conf") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.703109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.703917 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.704922 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.705290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.705790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.705975 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706179 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdc87c63-a124-485c-8f34-016d17a58f29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706361 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf5jl\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-kube-api-access-lf5jl\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/cdc87c63-a124-485c-8f34-016d17a58f29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706527 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706661 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706747 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706807 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706870 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fczd7\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-kube-api-access-fczd7\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706936 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706994 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.707077 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.707224 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.707289 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.707349 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.703810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706174 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.706935 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.707674 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.704870 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdc87c63-a124-485c-8f34-016d17a58f29-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.705237 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.709956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.728106 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdc87c63-a124-485c-8f34-016d17a58f29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.732563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.732897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdc87c63-a124-485c-8f34-016d17a58f29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.735939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf5jl\" (UniqueName: 
\"kubernetes.io/projected/cdc87c63-a124-485c-8f34-016d17a58f29-kube-api-access-lf5jl\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.761244 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.762394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" (UID: "a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.766306 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cdc87c63-a124-485c-8f34-016d17a58f29\") " pod="openstack/rabbitmq-server-0" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.809525 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.809775 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 05 20:27:56 crc kubenswrapper[4885]: I1205 20:27:56.856180 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.160215 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" containerID="f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c" exitCode=0 Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.160378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerDied","Data":"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c"} Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.160573 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68","Type":"ContainerDied","Data":"b833772c175681c8e67a2eb332f9c311bb97f55f73a4fd4d359ffe79e15279d5"} Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.160594 4885 scope.go:117] "RemoveContainer" containerID="f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.160453 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.185437 4885 scope.go:117] "RemoveContainer" containerID="cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.186008 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee" path="/var/lib/kubelet/pods/2a65b714-cb9c-4ce6-a5eb-5ebe8a7b2bee/volumes" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.202358 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.218591 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.226357 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.227963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.233352 4885 scope.go:117] "RemoveContainer" containerID="f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.233726 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mkf59" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.233829 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.233952 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.234096 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.234106 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.234216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.234226 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 20:27:57 crc kubenswrapper[4885]: E1205 20:27:57.243072 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c\": container with ID starting with f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c not found: ID does not exist" containerID="f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.243113 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c"} err="failed to get container status \"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c\": rpc error: code = NotFound desc = could not find container \"f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c\": container with ID starting with 
f778b64021c2d0e5e586038ad9bde7b16c2ba89dcabd1bd490cedc2a212d1f2c not found: ID does not exist" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.243137 4885 scope.go:117] "RemoveContainer" containerID="cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016" Dec 05 20:27:57 crc kubenswrapper[4885]: E1205 20:27:57.244910 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016\": container with ID starting with cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016 not found: ID does not exist" containerID="cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.244937 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016"} err="failed to get container status \"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016\": rpc error: code = NotFound desc = could not find container \"cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016\": container with ID starting with cb7951a010b1fcef7bbdf48a7626b008b3385f640484db6e4f060850b13a3016 not found: ID does not exist" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.252136 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318924 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318941 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318978 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psb2v\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-kube-api-access-psb2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.318994 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.319070 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.319086 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.319126 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.326244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420822 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420880 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420901 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420955 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.420988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.421045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.421073 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psb2v\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-kube-api-access-psb2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.421112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.421320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.421996 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.422730 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.422935 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.423014 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.424783 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.425135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.425902 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.425931 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.426601 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.439366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psb2v\" (UniqueName: \"kubernetes.io/projected/38cec51a-a7b6-420f-8efe-f21b3acf2f3f-kube-api-access-psb2v\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.458347 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38cec51a-a7b6-420f-8efe-f21b3acf2f3f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:57 crc kubenswrapper[4885]: I1205 20:27:57.553735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.033497 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:27:58 crc kubenswrapper[4885]: W1205 20:27:58.036832 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38cec51a_a7b6_420f_8efe_f21b3acf2f3f.slice/crio-ef09a66424418c673e267b64ebb000780558a10f16cf5e386a7f80cb7053ece1 WatchSource:0}: Error finding container ef09a66424418c673e267b64ebb000780558a10f16cf5e386a7f80cb7053ece1: Status 404 returned error can't find the container with id ef09a66424418c673e267b64ebb000780558a10f16cf5e386a7f80cb7053ece1 Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.169739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdc87c63-a124-485c-8f34-016d17a58f29","Type":"ContainerStarted","Data":"ff6546689a601d73cbef3e3b8ec736e7be03f82bdad10eaeecbb3beed0e02f77"} Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.172889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38cec51a-a7b6-420f-8efe-f21b3acf2f3f","Type":"ContainerStarted","Data":"ef09a66424418c673e267b64ebb000780558a10f16cf5e386a7f80cb7053ece1"} Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.897692 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.899893 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.902885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.910550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.954132 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.954704 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.954827 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxpm\" (UniqueName: \"kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.954951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.955174 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.955311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:58 crc kubenswrapper[4885]: I1205 20:27:58.955536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.057751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: 
\"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxpm\" (UniqueName: \"kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058519 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.058704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.059468 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.060226 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " 
pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.060408 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.060417 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.061058 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.076326 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxpm\" (UniqueName: \"kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm\") pod \"dnsmasq-dns-74f7bb59fc-hs4wf\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.182995 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68" path="/var/lib/kubelet/pods/a9ea268e-bdd3-4ff6-b04c-f15e8bc98d68/volumes" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.183717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdc87c63-a124-485c-8f34-016d17a58f29","Type":"ContainerStarted","Data":"38638227110c2c40241d0963574b65c374d3749739a65faca3f91e3ea15b21c2"} Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.223353 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:27:59 crc kubenswrapper[4885]: I1205 20:27:59.675901 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:28:00 crc kubenswrapper[4885]: I1205 20:28:00.195426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38cec51a-a7b6-420f-8efe-f21b3acf2f3f","Type":"ContainerStarted","Data":"05ad1b319dc142620a79b0f4b33b49dfaeedae941116ff268d31806a9874e0cc"} Dec 05 20:28:00 crc kubenswrapper[4885]: I1205 20:28:00.200367 4885 generic.go:334] "Generic (PLEG): container finished" podID="863a7b08-7f6a-41e6-986c-307957d54f22" containerID="82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42" exitCode=0 Dec 05 20:28:00 crc kubenswrapper[4885]: I1205 20:28:00.200533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" event={"ID":"863a7b08-7f6a-41e6-986c-307957d54f22","Type":"ContainerDied","Data":"82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42"} Dec 05 20:28:00 crc kubenswrapper[4885]: I1205 20:28:00.200568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" event={"ID":"863a7b08-7f6a-41e6-986c-307957d54f22","Type":"ContainerStarted","Data":"427ce79fd980ebc9cf5d97325ad511b3721fcb64682011fdb2d5340612bde82e"} Dec 05 20:28:01 crc kubenswrapper[4885]: I1205 20:28:01.222491 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" event={"ID":"863a7b08-7f6a-41e6-986c-307957d54f22","Type":"ContainerStarted","Data":"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a"} Dec 05 20:28:01 crc kubenswrapper[4885]: I1205 20:28:01.254041 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" podStartSLOduration=3.254000023 podStartE2EDuration="3.254000023s" podCreationTimestamp="2025-12-05 20:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:28:01.24814963 +0000 UTC m=+1346.544965331" watchObservedRunningTime="2025-12-05 20:28:01.254000023 +0000 UTC m=+1346.550815694" Dec 05 20:28:02 crc kubenswrapper[4885]: I1205 20:28:02.232795 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.224176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.284254 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.284543 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="dnsmasq-dns" containerID="cri-o://04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7" gracePeriod=10 Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.444857 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78f49d79c7-7qk6g"] Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.447586 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.481405 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f49d79c7-7qk6g"] Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.610833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.610909 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-swift-storage-0\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.610939 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-sb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.611007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-svc\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.611105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqdgd\" (UniqueName: \"kubernetes.io/projected/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-kube-api-access-lqdgd\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.611133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-nb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.611173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-config\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.713982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 
20:28:09.714075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-swift-storage-0\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.714095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-sb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.714213 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-svc\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.714330 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdgd\" (UniqueName: \"kubernetes.io/projected/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-kube-api-access-lqdgd\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.714384 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-nb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.714416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-config\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.715068 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-svc\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.715398 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-sb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.715398 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-dns-swift-storage-0\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.716353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-ovsdbserver-nb\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.716368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-config\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.716389 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-openstack-edpm-ipam\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.745480 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdgd\" (UniqueName: \"kubernetes.io/projected/2bb6d6a7-1ca1-4089-91e9-f8641f2f262e-kube-api-access-lqdgd\") pod \"dnsmasq-dns-78f49d79c7-7qk6g\" (UID: \"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e\") " pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.791675 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:09 crc kubenswrapper[4885]: I1205 20:28:09.915300 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.021629 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.021754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.021788 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.021896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.021948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbwt\" (UniqueName: \"kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " 
Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.022006 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc\") pod \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\" (UID: \"d2a5f17e-ef46-4471-b9bc-26133ef3760c\") " Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.028149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt" (OuterVolumeSpecName: "kube-api-access-6gbwt") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "kube-api-access-6gbwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.080432 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config" (OuterVolumeSpecName: "config") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.083048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.091630 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.113317 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.116680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2a5f17e-ef46-4471-b9bc-26133ef3760c" (UID: "d2a5f17e-ef46-4471-b9bc-26133ef3760c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124552 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124581 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124594 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124603 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbwt\" (UniqueName: \"kubernetes.io/projected/d2a5f17e-ef46-4471-b9bc-26133ef3760c-kube-api-access-6gbwt\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124612 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.124621 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a5f17e-ef46-4471-b9bc-26133ef3760c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.299750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f49d79c7-7qk6g"] Dec 05 20:28:10 crc kubenswrapper[4885]: W1205 20:28:10.301272 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb6d6a7_1ca1_4089_91e9_f8641f2f262e.slice/crio-27894b15b3ae9969fffe0a92ed00ebc3520e09c8a027f69c0b25d99868af27e7 WatchSource:0}: Error finding container 27894b15b3ae9969fffe0a92ed00ebc3520e09c8a027f69c0b25d99868af27e7: Status 404 returned error can't find the container with id 27894b15b3ae9969fffe0a92ed00ebc3520e09c8a027f69c0b25d99868af27e7 Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.313924 4885 generic.go:334] "Generic (PLEG): container finished" podID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerID="04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7" exitCode=0 Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.313963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" event={"ID":"d2a5f17e-ef46-4471-b9bc-26133ef3760c","Type":"ContainerDied","Data":"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7"} Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.313990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" event={"ID":"d2a5f17e-ef46-4471-b9bc-26133ef3760c","Type":"ContainerDied","Data":"dc14dfac320afbe3b7a0aac31d17ee472bd9ca6a71a642ebe2205adb8bb70794"} Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.314006 4885 scope.go:117] "RemoveContainer" containerID="04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.314140 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c8fb5597c-8bfq9" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.348944 4885 scope.go:117] "RemoveContainer" containerID="70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.351938 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.360544 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c8fb5597c-8bfq9"] Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.384903 4885 scope.go:117] "RemoveContainer" containerID="04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7" Dec 05 20:28:10 crc kubenswrapper[4885]: E1205 20:28:10.385405 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7\": container with ID starting with 04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7 not found: ID does not exist" containerID="04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.385428 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7"} err="failed to get container status \"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7\": rpc error: code = NotFound desc = could not find container \"04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7\": container with ID starting with 04738de63af09e27861520d526ed66607eb4fa86bf7e70cada00561f3dc9a3e7 not found: ID does not exist" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.385446 4885 scope.go:117] "RemoveContainer" containerID="70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc" Dec 05 20:28:10 crc kubenswrapper[4885]: E1205 20:28:10.385711 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc\": container with ID starting with 70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc not found: ID does not exist" containerID="70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc" Dec 05 20:28:10 crc kubenswrapper[4885]: I1205 20:28:10.385729 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc"} err="failed to get container status \"70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc\": rpc error: code = NotFound desc = could not find container \"70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc\": container with ID starting with 70e2e034599be4ed6f9b033150e871cca49087905a66f20d0342ad9bfb0f2ccc not found: ID does not exist" Dec 05 20:28:11 crc kubenswrapper[4885]: I1205 20:28:11.188909 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" path="/var/lib/kubelet/pods/d2a5f17e-ef46-4471-b9bc-26133ef3760c/volumes" Dec 05 20:28:11 crc kubenswrapper[4885]: I1205 20:28:11.324149 4885 generic.go:334] "Generic (PLEG): container finished" podID="2bb6d6a7-1ca1-4089-91e9-f8641f2f262e" containerID="09343a0e8c625a78aceca964339e3ca8d2cd60ecc4fb6e160021cb4fb4888968" 
exitCode=0 Dec 05 20:28:11 crc kubenswrapper[4885]: I1205 20:28:11.324223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" event={"ID":"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e","Type":"ContainerDied","Data":"09343a0e8c625a78aceca964339e3ca8d2cd60ecc4fb6e160021cb4fb4888968"} Dec 05 20:28:11 crc kubenswrapper[4885]: I1205 20:28:11.324250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" event={"ID":"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e","Type":"ContainerStarted","Data":"27894b15b3ae9969fffe0a92ed00ebc3520e09c8a027f69c0b25d99868af27e7"} Dec 05 20:28:12 crc kubenswrapper[4885]: I1205 20:28:12.341708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" event={"ID":"2bb6d6a7-1ca1-4089-91e9-f8641f2f262e","Type":"ContainerStarted","Data":"05fdd9c3c15d24c6f6d4f247fd27862f01593cc190f8eb88be3953f81d258088"} Dec 05 20:28:12 crc kubenswrapper[4885]: I1205 20:28:12.342098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:12 crc kubenswrapper[4885]: I1205 20:28:12.376880 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" podStartSLOduration=3.376853676 podStartE2EDuration="3.376853676s" podCreationTimestamp="2025-12-05 20:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:28:12.362741426 +0000 UTC m=+1357.659557147" watchObservedRunningTime="2025-12-05 20:28:12.376853676 +0000 UTC m=+1357.673669347" Dec 05 20:28:16 crc kubenswrapper[4885]: I1205 20:28:16.631087 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:28:16 crc kubenswrapper[4885]: I1205 20:28:16.631721 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:28:16 crc kubenswrapper[4885]: I1205 20:28:16.631772 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:28:16 crc kubenswrapper[4885]: I1205 20:28:16.632643 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:28:16 crc kubenswrapper[4885]: I1205 20:28:16.632719 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34" gracePeriod=600 Dec 05 20:28:17 crc kubenswrapper[4885]: I1205 20:28:17.396234 4885 
generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34" exitCode=0 Dec 05 20:28:17 crc kubenswrapper[4885]: I1205 20:28:17.396854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34"} Dec 05 20:28:17 crc kubenswrapper[4885]: I1205 20:28:17.396899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"} Dec 05 20:28:17 crc kubenswrapper[4885]: I1205 20:28:17.396920 4885 scope.go:117] "RemoveContainer" containerID="7059cc5d928871aedc23182a22e9ba744742e5284851e631b5de955d05b94f8c" Dec 05 20:28:19 crc kubenswrapper[4885]: I1205 20:28:19.793200 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78f49d79c7-7qk6g" Dec 05 20:28:19 crc kubenswrapper[4885]: I1205 20:28:19.878209 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:28:19 crc kubenswrapper[4885]: I1205 20:28:19.878479 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="dnsmasq-dns" containerID="cri-o://11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a" gracePeriod=10 Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.424509 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.431480 4885 generic.go:334] "Generic (PLEG): container finished" podID="863a7b08-7f6a-41e6-986c-307957d54f22" containerID="11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a" exitCode=0 Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.431514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" event={"ID":"863a7b08-7f6a-41e6-986c-307957d54f22","Type":"ContainerDied","Data":"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a"} Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.431538 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" event={"ID":"863a7b08-7f6a-41e6-986c-307957d54f22","Type":"ContainerDied","Data":"427ce79fd980ebc9cf5d97325ad511b3721fcb64682011fdb2d5340612bde82e"} Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.431553 4885 scope.go:117] "RemoveContainer" containerID="11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.431660 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f7bb59fc-hs4wf" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.457705 4885 scope.go:117] "RemoveContainer" containerID="82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.497656 4885 scope.go:117] "RemoveContainer" containerID="11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a" Dec 05 20:28:20 crc kubenswrapper[4885]: E1205 20:28:20.498671 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a\": container with ID starting with 11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a not found: ID does not exist" containerID="11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.498724 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a"} err="failed to get container status \"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a\": rpc error: code = NotFound desc = could not find container \"11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a\": container with ID starting with 11cded92537dc001eef1420511caf87408d7e1eeaa42605ff05687196ebdbe9a not found: ID does not exist" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.498759 4885 scope.go:117] "RemoveContainer" containerID="82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42" Dec 05 20:28:20 crc kubenswrapper[4885]: E1205 20:28:20.499156 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42\": container with ID starting with 82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42 not found: ID does not exist" containerID="82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.499190 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42"} err="failed to get container status \"82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42\": rpc error: code = NotFound desc = could not find container \"82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42\": container with ID starting with 82328be3f96a3240124057501827af3761a5c278f0269f1e2ba7464e80b4cb42 not found: ID does not exist" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fxpm\" (UniqueName: \"kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534667 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534780 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.534837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config\") pod \"863a7b08-7f6a-41e6-986c-307957d54f22\" (UID: \"863a7b08-7f6a-41e6-986c-307957d54f22\") " Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.557972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm" (OuterVolumeSpecName: "kube-api-access-2fxpm") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "kube-api-access-2fxpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.582082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.591652 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config" (OuterVolumeSpecName: "config") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.594251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.599248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.599999 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.617577 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "863a7b08-7f6a-41e6-986c-307957d54f22" (UID: "863a7b08-7f6a-41e6-986c-307957d54f22"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637595 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637628 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637641 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fxpm\" (UniqueName: \"kubernetes.io/projected/863a7b08-7f6a-41e6-986c-307957d54f22-kube-api-access-2fxpm\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637650 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637660 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637668 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.637676 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/863a7b08-7f6a-41e6-986c-307957d54f22-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.771215 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:28:20 crc kubenswrapper[4885]: I1205 20:28:20.779370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-74f7bb59fc-hs4wf"] Dec 05 20:28:21 crc kubenswrapper[4885]: I1205 20:28:21.182704 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" path="/var/lib/kubelet/pods/863a7b08-7f6a-41e6-986c-307957d54f22/volumes" Dec 05 20:28:31 crc kubenswrapper[4885]: I1205 20:28:31.583732 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdc87c63-a124-485c-8f34-016d17a58f29" containerID="38638227110c2c40241d0963574b65c374d3749739a65faca3f91e3ea15b21c2" exitCode=0 Dec 05 20:28:31 crc kubenswrapper[4885]: I1205 20:28:31.583803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdc87c63-a124-485c-8f34-016d17a58f29","Type":"ContainerDied","Data":"38638227110c2c40241d0963574b65c374d3749739a65faca3f91e3ea15b21c2"} Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.593876 4885 generic.go:334] "Generic (PLEG): container finished" podID="38cec51a-a7b6-420f-8efe-f21b3acf2f3f" containerID="05ad1b319dc142620a79b0f4b33b49dfaeedae941116ff268d31806a9874e0cc" exitCode=0 Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.593977 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38cec51a-a7b6-420f-8efe-f21b3acf2f3f","Type":"ContainerDied","Data":"05ad1b319dc142620a79b0f4b33b49dfaeedae941116ff268d31806a9874e0cc"} Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.597182 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdc87c63-a124-485c-8f34-016d17a58f29","Type":"ContainerStarted","Data":"427e0fad5fc66e639f11bda9dde83715ebca22c06408420bf176b622d83e9096"} Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.597432 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.650881 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.650858251 podStartE2EDuration="36.650858251s" podCreationTimestamp="2025-12-05 20:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:28:32.641264953 +0000 UTC m=+1377.938080614" watchObservedRunningTime="2025-12-05 20:28:32.650858251 +0000 UTC m=+1377.947673912" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845295 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92"] Dec 05 20:28:32 crc kubenswrapper[4885]: E1205 20:28:32.845651 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845662 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: E1205 20:28:32.845692 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845699 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: E1205 20:28:32.845709 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="init" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845715 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="init" Dec 05 20:28:32 crc kubenswrapper[4885]: E1205 20:28:32.845734 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="init" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845740 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="init" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845900 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a5f17e-ef46-4471-b9bc-26133ef3760c" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.845910 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="863a7b08-7f6a-41e6-986c-307957d54f22" containerName="dnsmasq-dns" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.846546 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.855827 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.855927 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.856045 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.856149 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.861651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92"] Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.996443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnjp\" (UniqueName: \"kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.996523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.996550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:32 crc kubenswrapper[4885]: I1205 20:28:32.996639 
4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.098139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnjp\" (UniqueName: \"kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.098239 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.098273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.098374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.103104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.114781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.116178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnjp\" (UniqueName: \"kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.116586 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.165612 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.635559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38cec51a-a7b6-420f-8efe-f21b3acf2f3f","Type":"ContainerStarted","Data":"d6f6a35fc7b62d706d25a2cbcace0ae62b390e0c4d1a63c6a188201580201c34"} Dec 05 20:28:33 crc kubenswrapper[4885]: I1205 20:28:33.636395 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:28:34 crc kubenswrapper[4885]: I1205 20:28:33.676762 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.676742046 podStartE2EDuration="36.676742046s" podCreationTimestamp="2025-12-05 20:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:28:33.670889264 +0000 UTC m=+1378.967704935" watchObservedRunningTime="2025-12-05 20:28:33.676742046 +0000 UTC m=+1378.973557717" Dec 05 20:28:34 crc kubenswrapper[4885]: I1205 20:28:33.750835 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92"] Dec 05 20:28:34 crc kubenswrapper[4885]: W1205 20:28:33.751143 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod489dbc8e_e2ca_41aa_9e48_ca81bea02758.slice/crio-ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895 WatchSource:0}: Error finding container ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895: Status 404 returned error can't find the container with id ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895 Dec 05 20:28:34 crc kubenswrapper[4885]: I1205 20:28:33.753532 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:28:34 crc kubenswrapper[4885]: I1205 20:28:34.650948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" event={"ID":"489dbc8e-e2ca-41aa-9e48-ca81bea02758","Type":"ContainerStarted","Data":"ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895"} Dec 05 20:28:43 crc kubenswrapper[4885]: I1205 20:28:43.983138 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:28:44 crc kubenswrapper[4885]: I1205 20:28:44.758176 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" event={"ID":"489dbc8e-e2ca-41aa-9e48-ca81bea02758","Type":"ContainerStarted","Data":"bc87ca5094919fc32e93f35ea66fdac0c1c3201c63bb014bd351dcb8f0e10e48"} Dec 05 20:28:44 crc kubenswrapper[4885]: I1205 20:28:44.778734 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" podStartSLOduration=2.551455845 podStartE2EDuration="12.778715378s" podCreationTimestamp="2025-12-05 20:28:32 +0000 UTC" firstStartedPulling="2025-12-05 20:28:33.7533285 +0000 UTC m=+1379.050144161" lastFinishedPulling="2025-12-05 20:28:43.980588033 +0000 UTC m=+1389.277403694" observedRunningTime="2025-12-05 20:28:44.777677946 +0000 UTC m=+1390.074493627" watchObservedRunningTime="2025-12-05 20:28:44.778715378 +0000 UTC m=+1390.075531039" Dec 05 20:28:46 crc kubenswrapper[4885]: I1205 20:28:46.860236 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 20:28:47 crc kubenswrapper[4885]: I1205 20:28:47.597327 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:28:55 crc kubenswrapper[4885]: I1205 20:28:55.862442 4885 generic.go:334] "Generic (PLEG): container finished" podID="489dbc8e-e2ca-41aa-9e48-ca81bea02758" containerID="bc87ca5094919fc32e93f35ea66fdac0c1c3201c63bb014bd351dcb8f0e10e48" exitCode=0 Dec 05 20:28:55 crc kubenswrapper[4885]: I1205 20:28:55.862573 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" event={"ID":"489dbc8e-e2ca-41aa-9e48-ca81bea02758","Type":"ContainerDied","Data":"bc87ca5094919fc32e93f35ea66fdac0c1c3201c63bb014bd351dcb8f0e10e48"} Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.288256 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.324631 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key\") pod \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.324788 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgnjp\" (UniqueName: \"kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp\") pod \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.324855 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle\") pod \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.324934 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory\") pod \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\" (UID: \"489dbc8e-e2ca-41aa-9e48-ca81bea02758\") " Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.330479 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "489dbc8e-e2ca-41aa-9e48-ca81bea02758" (UID: "489dbc8e-e2ca-41aa-9e48-ca81bea02758"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.340496 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp" (OuterVolumeSpecName: "kube-api-access-tgnjp") pod "489dbc8e-e2ca-41aa-9e48-ca81bea02758" (UID: "489dbc8e-e2ca-41aa-9e48-ca81bea02758"). InnerVolumeSpecName "kube-api-access-tgnjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.357483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory" (OuterVolumeSpecName: "inventory") pod "489dbc8e-e2ca-41aa-9e48-ca81bea02758" (UID: "489dbc8e-e2ca-41aa-9e48-ca81bea02758"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.358337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "489dbc8e-e2ca-41aa-9e48-ca81bea02758" (UID: "489dbc8e-e2ca-41aa-9e48-ca81bea02758"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.427486 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.427519 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.427530 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgnjp\" (UniqueName: \"kubernetes.io/projected/489dbc8e-e2ca-41aa-9e48-ca81bea02758-kube-api-access-tgnjp\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.427541 4885 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489dbc8e-e2ca-41aa-9e48-ca81bea02758-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.881532 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" event={"ID":"489dbc8e-e2ca-41aa-9e48-ca81bea02758","Type":"ContainerDied","Data":"ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895"} Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.881627 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef90c319cd5c67b475a8621049c8d78ab28e03c0d3b2bae70e86918a38b2d895" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.881997 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.967133 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz"] Dec 05 20:28:57 crc kubenswrapper[4885]: E1205 20:28:57.967765 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489dbc8e-e2ca-41aa-9e48-ca81bea02758" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.967850 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="489dbc8e-e2ca-41aa-9e48-ca81bea02758" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.969951 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="489dbc8e-e2ca-41aa-9e48-ca81bea02758" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.970759 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.973357 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.974192 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.974220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.974284 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:28:57 crc kubenswrapper[4885]: I1205 20:28:57.983002 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz"] Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.041159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxgt\" (UniqueName: \"kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.041275 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.041303 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.143644 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.143952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.144293 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxgt\" (UniqueName: \"kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.150310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.152436 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.167414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxgt\" (UniqueName: \"kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdzlz\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.306004 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.845375 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz"] Dec 05 20:28:58 crc kubenswrapper[4885]: I1205 20:28:58.900216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" event={"ID":"a40c582a-e811-4e60-a7fe-1bf467d32e96","Type":"ContainerStarted","Data":"07508bf1a7ac9dbe20b3967bc1d529976b40cc3cda2e8f3427244cdf5fbbcafc"} Dec 05 20:28:59 crc kubenswrapper[4885]: I1205 20:28:59.912124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" event={"ID":"a40c582a-e811-4e60-a7fe-1bf467d32e96","Type":"ContainerStarted","Data":"54de3e9a8863b0a8e9aa58da0456fb1ac72db67de88db9fa1aee7e5b78cc4925"} Dec 05 20:28:59 crc kubenswrapper[4885]: I1205 20:28:59.946247 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" podStartSLOduration=2.525756057 podStartE2EDuration="2.946219376s" podCreationTimestamp="2025-12-05 20:28:57 +0000 UTC" firstStartedPulling="2025-12-05 20:28:58.863013156 +0000 UTC m=+1404.159828817" lastFinishedPulling="2025-12-05 20:28:59.283476475 +0000 UTC m=+1404.580292136" observedRunningTime="2025-12-05 20:28:59.927507933 +0000 UTC m=+1405.224323614" watchObservedRunningTime="2025-12-05 20:28:59.946219376 +0000 UTC m=+1405.243035057" Dec 05 20:29:02 crc kubenswrapper[4885]: I1205 20:29:02.943911 4885 generic.go:334] "Generic (PLEG): container finished" podID="a40c582a-e811-4e60-a7fe-1bf467d32e96" containerID="54de3e9a8863b0a8e9aa58da0456fb1ac72db67de88db9fa1aee7e5b78cc4925" exitCode=0 Dec 05 20:29:02 crc kubenswrapper[4885]: I1205 20:29:02.943984 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" event={"ID":"a40c582a-e811-4e60-a7fe-1bf467d32e96","Type":"ContainerDied","Data":"54de3e9a8863b0a8e9aa58da0456fb1ac72db67de88db9fa1aee7e5b78cc4925"} Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.621608 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.776409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory\") pod \"a40c582a-e811-4e60-a7fe-1bf467d32e96\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.777310 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key\") pod \"a40c582a-e811-4e60-a7fe-1bf467d32e96\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.777474 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwxgt\" (UniqueName: \"kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt\") pod \"a40c582a-e811-4e60-a7fe-1bf467d32e96\" (UID: \"a40c582a-e811-4e60-a7fe-1bf467d32e96\") " Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.782679 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt" (OuterVolumeSpecName: "kube-api-access-lwxgt") pod "a40c582a-e811-4e60-a7fe-1bf467d32e96" (UID: "a40c582a-e811-4e60-a7fe-1bf467d32e96"). InnerVolumeSpecName "kube-api-access-lwxgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.810885 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a40c582a-e811-4e60-a7fe-1bf467d32e96" (UID: "a40c582a-e811-4e60-a7fe-1bf467d32e96"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.824949 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory" (OuterVolumeSpecName: "inventory") pod "a40c582a-e811-4e60-a7fe-1bf467d32e96" (UID: "a40c582a-e811-4e60-a7fe-1bf467d32e96"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.879312 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwxgt\" (UniqueName: \"kubernetes.io/projected/a40c582a-e811-4e60-a7fe-1bf467d32e96-kube-api-access-lwxgt\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.879343 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.879355 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a40c582a-e811-4e60-a7fe-1bf467d32e96-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.961626 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" event={"ID":"a40c582a-e811-4e60-a7fe-1bf467d32e96","Type":"ContainerDied","Data":"07508bf1a7ac9dbe20b3967bc1d529976b40cc3cda2e8f3427244cdf5fbbcafc"} Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.961670 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07508bf1a7ac9dbe20b3967bc1d529976b40cc3cda2e8f3427244cdf5fbbcafc" Dec 05 20:29:04 crc kubenswrapper[4885]: I1205 20:29:04.961679 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdzlz" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.031084 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d"] Dec 05 20:29:05 crc kubenswrapper[4885]: E1205 20:29:05.031677 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40c582a-e811-4e60-a7fe-1bf467d32e96" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.031699 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40c582a-e811-4e60-a7fe-1bf467d32e96" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.031920 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40c582a-e811-4e60-a7fe-1bf467d32e96" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.034761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.037097 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.037278 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.037484 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.037957 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.045957 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d"] Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.184038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.184118 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9fkb\" (UniqueName: \"kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.184203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.184381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.285980 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.286054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.286084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9fkb\" (UniqueName: \"kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.286160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.290525 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.290651 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.290719 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.302780 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9fkb\" (UniqueName: \"kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.353782 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.902218 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d"] Dec 05 20:29:05 crc kubenswrapper[4885]: I1205 20:29:05.981092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" event={"ID":"54bae71b-4af1-49b5-a41b-58e6aafd26ca","Type":"ContainerStarted","Data":"a7fa112c93bfbf516893c5f2f0a7d96344e8dd908f2a41edd929cf4e9133b406"} Dec 05 20:29:06 crc kubenswrapper[4885]: I1205 20:29:06.992532 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" event={"ID":"54bae71b-4af1-49b5-a41b-58e6aafd26ca","Type":"ContainerStarted","Data":"b4cc0f027ba466b0b5ab06d395b77d6e53abde1bb662ee8e59bb8b731a8aa383"} Dec 05 20:29:07 crc kubenswrapper[4885]: I1205 20:29:07.018923 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" podStartSLOduration=1.576137558 podStartE2EDuration="2.01889995s" podCreationTimestamp="2025-12-05 20:29:05 +0000 UTC" firstStartedPulling="2025-12-05 20:29:05.90351904 +0000 UTC m=+1411.200334741" lastFinishedPulling="2025-12-05 20:29:06.346281452 +0000 UTC m=+1411.643097133" observedRunningTime="2025-12-05 20:29:07.009638362 +0000 UTC m=+1412.306454043" watchObservedRunningTime="2025-12-05 20:29:07.01889995 +0000 UTC m=+1412.315715621" Dec 05 20:29:11 crc kubenswrapper[4885]: I1205 20:29:11.815255 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:11 crc kubenswrapper[4885]: I1205 20:29:11.818337 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:11 crc kubenswrapper[4885]: I1205 20:29:11.833570 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.011053 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.011161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjdn\" (UniqueName: \"kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.011193 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.112837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.113361 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjdn\" (UniqueName: \"kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.113446 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.113466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.113823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.134332 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rcjdn\" (UniqueName: \"kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn\") pod \"redhat-operators-s2v26\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.159772 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:12 crc kubenswrapper[4885]: I1205 20:29:12.752083 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:13 crc kubenswrapper[4885]: I1205 20:29:13.051386 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerStarted","Data":"af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f"} Dec 05 20:29:14 crc kubenswrapper[4885]: I1205 20:29:14.064155 4885 generic.go:334] "Generic (PLEG): container finished" podID="04947505-ce67-49df-8945-fd6dce52c75d" containerID="166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980" exitCode=0 Dec 05 20:29:14 crc kubenswrapper[4885]: I1205 20:29:14.065234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerDied","Data":"166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980"} Dec 05 20:29:15 crc kubenswrapper[4885]: I1205 20:29:15.076852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerStarted","Data":"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24"} Dec 05 20:29:17 crc kubenswrapper[4885]: I1205 20:29:17.104602 4885 generic.go:334] "Generic (PLEG): container finished" podID="04947505-ce67-49df-8945-fd6dce52c75d" containerID="2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24" exitCode=0 Dec 05 20:29:17 crc kubenswrapper[4885]: I1205 20:29:17.104656 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerDied","Data":"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24"} Dec 05 20:29:18 crc kubenswrapper[4885]: I1205 20:29:18.122394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerStarted","Data":"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43"} Dec 05 20:29:18 crc kubenswrapper[4885]: I1205 20:29:18.147227 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s2v26" podStartSLOduration=3.518860455 podStartE2EDuration="7.147206452s" podCreationTimestamp="2025-12-05 20:29:11 +0000 UTC" firstStartedPulling="2025-12-05 20:29:14.067882687 +0000 UTC m=+1419.364698358" lastFinishedPulling="2025-12-05 20:29:17.696228694 +0000 UTC m=+1422.993044355" observedRunningTime="2025-12-05 20:29:18.14551074 +0000 UTC m=+1423.442326411" watchObservedRunningTime="2025-12-05 20:29:18.147206452 +0000 UTC m=+1423.444022113" Dec 05 20:29:22 crc kubenswrapper[4885]: I1205 20:29:22.160406 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s2v26" Dec 
05 20:29:22 crc kubenswrapper[4885]: I1205 20:29:22.161050 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:23 crc kubenswrapper[4885]: I1205 20:29:23.226519 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s2v26" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="registry-server" probeResult="failure" output=< Dec 05 20:29:23 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 20:29:23 crc kubenswrapper[4885]: > Dec 05 20:29:32 crc kubenswrapper[4885]: I1205 20:29:32.258527 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:32 crc kubenswrapper[4885]: I1205 20:29:32.317575 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:32 crc kubenswrapper[4885]: I1205 20:29:32.507325 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.271711 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s2v26" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="registry-server" containerID="cri-o://42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43" gracePeriod=2 Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.710504 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.756122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content\") pod \"04947505-ce67-49df-8945-fd6dce52c75d\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.756174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities\") pod \"04947505-ce67-49df-8945-fd6dce52c75d\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.756338 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjdn\" (UniqueName: \"kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn\") pod \"04947505-ce67-49df-8945-fd6dce52c75d\" (UID: \"04947505-ce67-49df-8945-fd6dce52c75d\") " Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.757190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities" (OuterVolumeSpecName: "utilities") pod "04947505-ce67-49df-8945-fd6dce52c75d" (UID: "04947505-ce67-49df-8945-fd6dce52c75d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.762859 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn" (OuterVolumeSpecName: "kube-api-access-rcjdn") pod "04947505-ce67-49df-8945-fd6dce52c75d" (UID: "04947505-ce67-49df-8945-fd6dce52c75d"). InnerVolumeSpecName "kube-api-access-rcjdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.858504 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.858738 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjdn\" (UniqueName: \"kubernetes.io/projected/04947505-ce67-49df-8945-fd6dce52c75d-kube-api-access-rcjdn\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.892073 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04947505-ce67-49df-8945-fd6dce52c75d" (UID: "04947505-ce67-49df-8945-fd6dce52c75d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:29:34 crc kubenswrapper[4885]: I1205 20:29:34.961281 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04947505-ce67-49df-8945-fd6dce52c75d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.282303 4885 generic.go:334] "Generic (PLEG): container finished" podID="04947505-ce67-49df-8945-fd6dce52c75d" containerID="42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43" exitCode=0 Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.282386 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s2v26" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.282389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerDied","Data":"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43"} Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.283858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2v26" event={"ID":"04947505-ce67-49df-8945-fd6dce52c75d","Type":"ContainerDied","Data":"af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f"} Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.283885 4885 scope.go:117] "RemoveContainer" containerID="42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.313812 4885 scope.go:117] "RemoveContainer" containerID="2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.314551 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.322043 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s2v26"] Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.351299 4885 scope.go:117] "RemoveContainer" containerID="166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.380834 4885 scope.go:117] "RemoveContainer" containerID="42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43" Dec 05 20:29:35 crc kubenswrapper[4885]: E1205 20:29:35.381280 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43\": container with ID starting with 42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43 not found: ID does not exist" containerID="42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.381310 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43"} err="failed to get container status \"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43\": rpc error: code = NotFound desc = could not find container \"42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43\": container with ID starting with 42a9983e5760a5d62fb16af2b04070982edb10e1345137a0354cf00ea4060b43 not found: ID does not exist" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.381331 4885 scope.go:117] "RemoveContainer" containerID="2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24" Dec 05 20:29:35 crc kubenswrapper[4885]: E1205 20:29:35.381676 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24\": container with ID starting with 2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24 not found: ID does not exist" containerID="2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.381697 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24"} err="failed to get container status \"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24\": rpc error: code = NotFound desc = could not find container \"2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24\": container with ID starting with 2152578b871ed3585e4215059f758da54b3a0f4d9280ba7abde61e34b3ad2d24 not found: ID does not exist" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.381708 4885 scope.go:117] "RemoveContainer" containerID="166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980" Dec 05 20:29:35 crc kubenswrapper[4885]: E1205 20:29:35.381984 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980\": container with ID starting with 166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980 not found: ID does not exist" containerID="166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980" Dec 05 20:29:35 crc kubenswrapper[4885]: I1205 20:29:35.382004 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980"} err="failed to get container status \"166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980\": rpc error: code = NotFound desc = could not find container \"166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980\": container with ID starting with 166007db6f09af1a497987e003decee16561cdb9c33416e4a19e4aeb54b13980 not found: ID does not exist" Dec 05 20:29:37 crc kubenswrapper[4885]: I1205 20:29:37.183473 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04947505-ce67-49df-8945-fd6dce52c75d" path="/var/lib/kubelet/pods/04947505-ce67-49df-8945-fd6dce52c75d/volumes" Dec 05 20:29:40 crc kubenswrapper[4885]: E1205 20:29:40.191574 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache]" Dec 05 20:29:50 crc kubenswrapper[4885]: E1205 20:29:50.503993 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache]" Dec 05 20:29:52 crc kubenswrapper[4885]: I1205 20:29:52.988676 4885 scope.go:117] "RemoveContainer" containerID="f25a0fc00444ba0cebd20b21f60f8abe3a689a707ee249c082905f312d12a095" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.164084 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt"] Dec 05 20:30:00 crc 
kubenswrapper[4885]: E1205 20:30:00.165582 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="registry-server" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.165605 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="registry-server" Dec 05 20:30:00 crc kubenswrapper[4885]: E1205 20:30:00.165675 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="extract-utilities" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.165688 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="extract-utilities" Dec 05 20:30:00 crc kubenswrapper[4885]: E1205 20:30:00.165707 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="extract-content" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.165719 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="extract-content" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.166050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="04947505-ce67-49df-8945-fd6dce52c75d" containerName="registry-server" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.166981 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.170260 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.170638 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.175439 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt"] Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.286142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.286733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll59\" (UniqueName: \"kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.286826 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc 
kubenswrapper[4885]: I1205 20:30:00.388552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.388735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll59\" (UniqueName: \"kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.388775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.389476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.395994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.407532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll59\" (UniqueName: \"kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59\") pod \"collect-profiles-29416110-ww4gt\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.534611 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:00 crc kubenswrapper[4885]: E1205 20:30:00.760482 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache]" Dec 05 20:30:00 crc kubenswrapper[4885]: I1205 20:30:00.963207 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt"] Dec 05 20:30:01 crc kubenswrapper[4885]: I1205 20:30:01.612823 4885 generic.go:334] "Generic (PLEG): container finished" podID="d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" containerID="0e99a558ebe9c458cbc59431f2536417d392456353f9d882bed807862382f5ec" exitCode=0 Dec 05 20:30:01 crc kubenswrapper[4885]: I1205 20:30:01.612888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" event={"ID":"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301","Type":"ContainerDied","Data":"0e99a558ebe9c458cbc59431f2536417d392456353f9d882bed807862382f5ec"} Dec 05 20:30:01 crc kubenswrapper[4885]: I1205 20:30:01.613266 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" event={"ID":"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301","Type":"ContainerStarted","Data":"60a98dea21509bfc6c6694f1e4396f7ef7d6986fcde0363bd73c3279443b9a29"} Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.938300 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.946844 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll59\" (UniqueName: \"kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59\") pod \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.946932 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume\") pod \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.946974 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume\") pod \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\" (UID: \"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301\") " Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.947839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" (UID: "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.953129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59" (OuterVolumeSpecName: "kube-api-access-dll59") pod "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" (UID: "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301"). InnerVolumeSpecName "kube-api-access-dll59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:30:02 crc kubenswrapper[4885]: I1205 20:30:02.959237 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" (UID: "d5d81118-a04d-40a2-bfbc-8cdcb5e0b301"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.049126 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll59\" (UniqueName: \"kubernetes.io/projected/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-kube-api-access-dll59\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.049155 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.049164 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.632734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" event={"ID":"d5d81118-a04d-40a2-bfbc-8cdcb5e0b301","Type":"ContainerDied","Data":"60a98dea21509bfc6c6694f1e4396f7ef7d6986fcde0363bd73c3279443b9a29"} Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.632783 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a98dea21509bfc6c6694f1e4396f7ef7d6986fcde0363bd73c3279443b9a29" Dec 05 20:30:03 crc kubenswrapper[4885]: I1205 20:30:03.632836 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt" Dec 05 20:30:10 crc kubenswrapper[4885]: E1205 20:30:10.995479 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache]" Dec 05 20:30:16 crc kubenswrapper[4885]: I1205 20:30:16.630840 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:30:16 crc kubenswrapper[4885]: I1205 20:30:16.631334 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:30:21 crc kubenswrapper[4885]: E1205 20:30:21.297871 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache]" Dec 05 20:30:31 crc kubenswrapper[4885]: E1205 20:30:31.559122 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice/crio-af287baf9b5dc4bb23fe9d48473fc6230be56d752f8e7214a9a84710c9e29f4f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04947505_ce67_49df_8945_fd6dce52c75d.slice\": RecentStats: unable to find data in memory cache]" Dec 05 20:30:35 crc kubenswrapper[4885]: E1205 20:30:35.218874 4885 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/f2eae0b453c6e0f10aac56be2246ee3f06b426dd4f24308b62ce9fa25212625d/diff" to get inode usage: stat /var/lib/containers/storage/overlay/f2eae0b453c6e0f10aac56be2246ee3f06b426dd4f24308b62ce9fa25212625d/diff: no such file or directory, extraDiskErr: Dec 05 20:30:46 crc kubenswrapper[4885]: I1205 20:30:46.631259 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:30:46 crc kubenswrapper[4885]: I1205 20:30:46.631907 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" 
podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:30:53 crc kubenswrapper[4885]: I1205 20:30:53.128865 4885 scope.go:117] "RemoveContainer" containerID="d13a3eeadcd25a0137b9bb8825da963d72f583fe23066153e8266055f0b0ce9e" Dec 05 20:31:16 crc kubenswrapper[4885]: I1205 20:31:16.631170 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:31:16 crc kubenswrapper[4885]: I1205 20:31:16.632017 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:31:16 crc kubenswrapper[4885]: I1205 20:31:16.632155 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:31:16 crc kubenswrapper[4885]: I1205 20:31:16.633494 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:31:16 crc kubenswrapper[4885]: I1205 20:31:16.633617 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" gracePeriod=600 Dec 05 20:31:16 crc kubenswrapper[4885]: E1205 20:31:16.761732 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:31:17 crc kubenswrapper[4885]: I1205 20:31:17.415445 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" exitCode=0 Dec 05 20:31:17 crc kubenswrapper[4885]: I1205 20:31:17.415506 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"} Dec 05 20:31:17 crc kubenswrapper[4885]: I1205 20:31:17.415561 4885 scope.go:117] "RemoveContainer" containerID="91c26cde9f44964206a15bb12fc6d413d79858501fac35b74853db9d5b02ba34" Dec 05 20:31:17 crc kubenswrapper[4885]: I1205 20:31:17.416346 4885 scope.go:117] "RemoveContainer" 
containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:31:17 crc kubenswrapper[4885]: E1205 20:31:17.416744 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:31:30 crc kubenswrapper[4885]: I1205 20:31:30.173354 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:31:30 crc kubenswrapper[4885]: E1205 20:31:30.174476 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.395039 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65plm"] Dec 05 20:31:39 crc kubenswrapper[4885]: E1205 20:31:39.395653 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" containerName="collect-profiles" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.395665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" containerName="collect-profiles" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.395894 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" containerName="collect-profiles" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.397195 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.411916 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65plm"] Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.508549 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccfbf\" (UniqueName: \"kubernetes.io/projected/f0443767-ff82-48a9-8fc4-c981ebe6ebac-kube-api-access-ccfbf\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.508634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-utilities\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.508750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-catalog-content\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.610315 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccfbf\" (UniqueName: \"kubernetes.io/projected/f0443767-ff82-48a9-8fc4-c981ebe6ebac-kube-api-access-ccfbf\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.610655 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-utilities\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.610748 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-catalog-content\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.611310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-catalog-content\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.611885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0443767-ff82-48a9-8fc4-c981ebe6ebac-utilities\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.641395 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ccfbf\" (UniqueName: \"kubernetes.io/projected/f0443767-ff82-48a9-8fc4-c981ebe6ebac-kube-api-access-ccfbf\") pod \"certified-operators-65plm\" (UID: \"f0443767-ff82-48a9-8fc4-c981ebe6ebac\") " pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:39 crc kubenswrapper[4885]: I1205 20:31:39.714843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65plm" Dec 05 20:31:40 crc kubenswrapper[4885]: I1205 20:31:40.209192 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65plm"] Dec 05 20:31:40 crc kubenswrapper[4885]: I1205 20:31:40.650568 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0443767-ff82-48a9-8fc4-c981ebe6ebac" containerID="ac1c70218cc0cf662078afcf8ee123278c9f5e406d22a32441e269db103fbbdb" exitCode=0 Dec 05 20:31:40 crc kubenswrapper[4885]: I1205 20:31:40.650833 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65plm" event={"ID":"f0443767-ff82-48a9-8fc4-c981ebe6ebac","Type":"ContainerDied","Data":"ac1c70218cc0cf662078afcf8ee123278c9f5e406d22a32441e269db103fbbdb"} Dec 05 20:31:40 crc kubenswrapper[4885]: I1205 20:31:40.650858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65plm" event={"ID":"f0443767-ff82-48a9-8fc4-c981ebe6ebac","Type":"ContainerStarted","Data":"a1e850cf42e069c911100e95aa19490e1af3c1bff6b6420c97f38a0ba1874a71"} Dec 05 20:31:44 crc kubenswrapper[4885]: I1205 20:31:44.687731 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65plm" event={"ID":"f0443767-ff82-48a9-8fc4-c981ebe6ebac","Type":"ContainerStarted","Data":"71b5695b2553161065eedd19811501ed22342a2dcf706cbbec5ba6294e10cd23"} Dec 05 20:31:45 crc kubenswrapper[4885]: I1205 20:31:45.179512 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:31:45 crc kubenswrapper[4885]: E1205 20:31:45.179999 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:31:45 crc kubenswrapper[4885]: I1205 20:31:45.700553 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0443767-ff82-48a9-8fc4-c981ebe6ebac" containerID="71b5695b2553161065eedd19811501ed22342a2dcf706cbbec5ba6294e10cd23" exitCode=0 Dec 05 20:31:45 crc kubenswrapper[4885]: I1205 20:31:45.700645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65plm" event={"ID":"f0443767-ff82-48a9-8fc4-c981ebe6ebac","Type":"ContainerDied","Data":"71b5695b2553161065eedd19811501ed22342a2dcf706cbbec5ba6294e10cd23"} Dec 05 20:31:46 crc kubenswrapper[4885]: I1205 20:31:46.711202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65plm" event={"ID":"f0443767-ff82-48a9-8fc4-c981ebe6ebac","Type":"ContainerStarted","Data":"f48d6b594b9cb5d05f144b7cf4b2cb9d9a85be60c381b0e5a52ee1515f62f48e"} Dec 05 20:31:46 crc kubenswrapper[4885]: I1205 20:31:46.726790 4885 
Dec 05 20:31:49 crc kubenswrapper[4885]: I1205 20:31:49.715697 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65plm"
Dec 05 20:31:49 crc kubenswrapper[4885]: I1205 20:31:49.716255 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65plm"
Dec 05 20:31:49 crc kubenswrapper[4885]: I1205 20:31:49.790738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65plm"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.131992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r5j6h"]
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.134702 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.146710 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r5j6h"]
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.166921 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.167011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9272\" (UniqueName: \"kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.167074 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.268191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.268286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9272\" (UniqueName: \"kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.268325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.268889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.270692 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.297918 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9272\" (UniqueName: \"kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272\") pod \"community-operators-r5j6h\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:57 crc kubenswrapper[4885]: I1205 20:31:57.456904 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:31:58 crc kubenswrapper[4885]: I1205 20:31:58.031118 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r5j6h"]
Dec 05 20:31:58 crc kubenswrapper[4885]: I1205 20:31:58.830816 4885 generic.go:334] "Generic (PLEG): container finished" podID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerID="d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4" exitCode=0
Dec 05 20:31:58 crc kubenswrapper[4885]: I1205 20:31:58.830908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerDied","Data":"d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4"}
Dec 05 20:31:58 crc kubenswrapper[4885]: I1205 20:31:58.831114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerStarted","Data":"758b7edee8f11df5445638c0b3d84ec3a681392d4f93e8782c674d0ae228744b"}
Dec 05 20:31:59 crc kubenswrapper[4885]: I1205 20:31:59.790111 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65plm"
Dec 05 20:31:59 crc kubenswrapper[4885]: I1205 20:31:59.853000 4885 generic.go:334] "Generic (PLEG): container finished" podID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerID="2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f" exitCode=0
Dec 05 20:31:59 crc kubenswrapper[4885]: I1205 20:31:59.853071 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerDied","Data":"2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f"}
Dec 05 20:32:00 crc kubenswrapper[4885]: I1205 20:32:00.173446 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:32:00 crc kubenswrapper[4885]: E1205 20:32:00.173740 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:32:00 crc kubenswrapper[4885]: I1205 20:32:00.864954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerStarted","Data":"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8"}
Dec 05 20:32:00 crc kubenswrapper[4885]: I1205 20:32:00.897375 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r5j6h" podStartSLOduration=2.427847814 podStartE2EDuration="3.897354401s" podCreationTimestamp="2025-12-05 20:31:57 +0000 UTC" firstStartedPulling="2025-12-05 20:31:58.833242439 +0000 UTC m=+1584.130058100" lastFinishedPulling="2025-12-05 20:32:00.302749026 +0000 UTC m=+1585.599564687" observedRunningTime="2025-12-05 20:32:00.890682763 +0000 UTC m=+1586.187498454" watchObservedRunningTime="2025-12-05 20:32:00.897354401 +0000 UTC m=+1586.194170072"
Dec 05 20:32:01 crc kubenswrapper[4885]: I1205 20:32:01.752921 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65plm"]
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.115922 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lv47n"]
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.116446 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lv47n" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="registry-server" containerID="cri-o://1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c" gracePeriod=2
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.647393 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lv47n"
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.691703 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content\") pod \"012f80db-3d51-4336-94d3-9a54c642d7db\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") "
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.691755 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities\") pod \"012f80db-3d51-4336-94d3-9a54c642d7db\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") "
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.691866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dddtw\" (UniqueName: \"kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw\") pod \"012f80db-3d51-4336-94d3-9a54c642d7db\" (UID: \"012f80db-3d51-4336-94d3-9a54c642d7db\") "
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.692816 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities" (OuterVolumeSpecName: "utilities") pod "012f80db-3d51-4336-94d3-9a54c642d7db" (UID: "012f80db-3d51-4336-94d3-9a54c642d7db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.711681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw" (OuterVolumeSpecName: "kube-api-access-dddtw") pod "012f80db-3d51-4336-94d3-9a54c642d7db" (UID: "012f80db-3d51-4336-94d3-9a54c642d7db"). InnerVolumeSpecName "kube-api-access-dddtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.748427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "012f80db-3d51-4336-94d3-9a54c642d7db" (UID: "012f80db-3d51-4336-94d3-9a54c642d7db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.797223 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.797459 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012f80db-3d51-4336-94d3-9a54c642d7db-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.797536 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dddtw\" (UniqueName: \"kubernetes.io/projected/012f80db-3d51-4336-94d3-9a54c642d7db-kube-api-access-dddtw\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.887083 4885 generic.go:334] "Generic (PLEG): container finished" podID="012f80db-3d51-4336-94d3-9a54c642d7db" containerID="1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c" exitCode=0
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.887137 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lv47n"
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.887169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerDied","Data":"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"}
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.887566 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lv47n" event={"ID":"012f80db-3d51-4336-94d3-9a54c642d7db","Type":"ContainerDied","Data":"4836d8c7bd384c48e7aa25f93bf537c4be236a53e1d6e3ee4e5a980af77a6060"}
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.887608 4885 scope.go:117] "RemoveContainer" containerID="1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.930138 4885 scope.go:117] "RemoveContainer" containerID="109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616"
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.941697 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lv47n"]
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.952532 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lv47n"]
Dec 05 20:32:02 crc kubenswrapper[4885]: I1205 20:32:02.991943 4885 scope.go:117] "RemoveContainer" containerID="68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.021582 4885 scope.go:117] "RemoveContainer" containerID="1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"
Dec 05 20:32:03 crc kubenswrapper[4885]: E1205 20:32:03.023405 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c\": container with ID starting with 1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c not found: ID does not exist" containerID="1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.023438 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c"} err="failed to get container status \"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c\": rpc error: code = NotFound desc = could not find container \"1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c\": container with ID starting with 1b73f8b05ab7e73c9a14dcfdba984c2819e1d2d5015cfc752118b8b9b5f6416c not found: ID does not exist"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.023462 4885 scope.go:117] "RemoveContainer" containerID="109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616"
Dec 05 20:32:03 crc kubenswrapper[4885]: E1205 20:32:03.023812 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616\": container with ID starting with 109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616 not found: ID does not exist" containerID="109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.023836 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616"} err="failed to get container status \"109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616\": rpc error: code = NotFound desc = could not find container \"109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616\": container with ID starting with 109df54d2d7e7b52eedafc9a35ed820858ef3b7012b65d377b91d5f99680a616 not found: ID does not exist"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.023852 4885 scope.go:117] "RemoveContainer" containerID="68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e"
Dec 05 20:32:03 crc kubenswrapper[4885]: E1205 20:32:03.024044 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e\": container with ID starting with 68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e not found: ID does not exist" containerID="68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.024068 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e"} err="failed to get container status \"68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e\": rpc error: code = NotFound desc = could not find container \"68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e\": container with ID starting with 68917757e92285ce4bbc7aa790427967f4cac016bce8be8a30860ddfe0dbee3e not found: ID does not exist"
Dec 05 20:32:03 crc kubenswrapper[4885]: I1205 20:32:03.182541 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" path="/var/lib/kubelet/pods/012f80db-3d51-4336-94d3-9a54c642d7db/volumes"
Dec 05 20:32:07 crc kubenswrapper[4885]: I1205 20:32:07.457743 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:32:07 crc kubenswrapper[4885]: I1205 20:32:07.458268 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:32:07 crc kubenswrapper[4885]: I1205 20:32:07.506476 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:32:08 crc kubenswrapper[4885]: I1205 20:32:08.262959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:32:09 crc kubenswrapper[4885]: I1205 20:32:09.114113 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r5j6h"]
Dec 05 20:32:10 crc kubenswrapper[4885]: I1205 20:32:10.209443 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r5j6h" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="registry-server" containerID="cri-o://1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8" gracePeriod=2
Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.210937 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r5j6h"
Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.218341 4885 generic.go:334] "Generic (PLEG): container finished" podID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerID="1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8" exitCode=0
Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.218377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerDied","Data":"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8"}
Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.218386 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r5j6h"
Need to start a new one" pod="openshift-marketplace/community-operators-r5j6h" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.218400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r5j6h" event={"ID":"faa1b375-8d14-43c0-a66d-e83f36e8bf12","Type":"ContainerDied","Data":"758b7edee8f11df5445638c0b3d84ec3a681392d4f93e8782c674d0ae228744b"} Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.218419 4885 scope.go:117] "RemoveContainer" containerID="1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.236809 4885 scope.go:117] "RemoveContainer" containerID="2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.266136 4885 scope.go:117] "RemoveContainer" containerID="d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.275881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities\") pod \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.275945 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content\") pod \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.276067 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9272\" (UniqueName: \"kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272\") pod \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\" (UID: \"faa1b375-8d14-43c0-a66d-e83f36e8bf12\") " Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.276868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities" (OuterVolumeSpecName: "utilities") pod "faa1b375-8d14-43c0-a66d-e83f36e8bf12" (UID: "faa1b375-8d14-43c0-a66d-e83f36e8bf12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.282368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272" (OuterVolumeSpecName: "kube-api-access-m9272") pod "faa1b375-8d14-43c0-a66d-e83f36e8bf12" (UID: "faa1b375-8d14-43c0-a66d-e83f36e8bf12"). InnerVolumeSpecName "kube-api-access-m9272". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.330305 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa1b375-8d14-43c0-a66d-e83f36e8bf12" (UID: "faa1b375-8d14-43c0-a66d-e83f36e8bf12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.349970 4885 scope.go:117] "RemoveContainer" containerID="1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8" Dec 05 20:32:11 crc kubenswrapper[4885]: E1205 20:32:11.350369 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8\": container with ID starting with 1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8 not found: ID does not exist" containerID="1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.350398 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8"} err="failed to get container status \"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8\": rpc error: code = NotFound desc = could not find container \"1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8\": container with ID starting with 1d9b0524baeeffa4a78702481e92ce4013662d0b44f5f25711ad651cf9aa8fa8 not found: ID does not exist" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.350418 4885 scope.go:117] "RemoveContainer" containerID="2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f" Dec 05 20:32:11 crc kubenswrapper[4885]: E1205 20:32:11.350835 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f\": container with ID starting with 2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f not found: ID does not exist" containerID="2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.350887 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f"} err="failed to get container status \"2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f\": rpc error: code = NotFound desc = could not find container \"2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f\": container with ID starting with 2b1fa5b85393c7d85d9532239785a5e727d6c71d86a5b97b3613431ef4846a6f not found: ID does not exist" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.350919 4885 scope.go:117] "RemoveContainer" containerID="d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4" Dec 05 20:32:11 crc kubenswrapper[4885]: E1205 20:32:11.351572 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4\": container with ID starting with d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4 not found: ID does not exist" containerID="d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.351603 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4"} err="failed to get container status \"d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4\": rpc error: code = NotFound desc = could not 
find container \"d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4\": container with ID starting with d60817f95561880a72979b194c803942124d57cb2f8c20e68a495bc1d175cfa4 not found: ID does not exist" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.378261 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9272\" (UniqueName: \"kubernetes.io/projected/faa1b375-8d14-43c0-a66d-e83f36e8bf12-kube-api-access-m9272\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.378300 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.378310 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa1b375-8d14-43c0-a66d-e83f36e8bf12-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.556169 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r5j6h"] Dec 05 20:32:11 crc kubenswrapper[4885]: I1205 20:32:11.564778 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r5j6h"] Dec 05 20:32:12 crc kubenswrapper[4885]: I1205 20:32:12.172848 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:32:12 crc kubenswrapper[4885]: E1205 20:32:12.173676 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:32:13 crc kubenswrapper[4885]: I1205 20:32:13.186865 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" path="/var/lib/kubelet/pods/faa1b375-8d14-43c0-a66d-e83f36e8bf12/volumes" Dec 05 20:32:24 crc kubenswrapper[4885]: I1205 20:32:24.172824 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:32:24 crc kubenswrapper[4885]: E1205 20:32:24.174352 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:32:25 crc kubenswrapper[4885]: I1205 20:32:25.393900 4885 generic.go:334] "Generic (PLEG): container finished" podID="54bae71b-4af1-49b5-a41b-58e6aafd26ca" containerID="b4cc0f027ba466b0b5ab06d395b77d6e53abde1bb662ee8e59bb8b731a8aa383" exitCode=0 Dec 05 20:32:25 crc kubenswrapper[4885]: I1205 20:32:25.393999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" event={"ID":"54bae71b-4af1-49b5-a41b-58e6aafd26ca","Type":"ContainerDied","Data":"b4cc0f027ba466b0b5ab06d395b77d6e53abde1bb662ee8e59bb8b731a8aa383"} 
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.849524 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d"
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.972713 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle\") pod \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") "
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.972783 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9fkb\" (UniqueName: \"kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb\") pod \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") "
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.972875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key\") pod \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") "
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.972911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory\") pod \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\" (UID: \"54bae71b-4af1-49b5-a41b-58e6aafd26ca\") "
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.981732 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "54bae71b-4af1-49b5-a41b-58e6aafd26ca" (UID: "54bae71b-4af1-49b5-a41b-58e6aafd26ca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:32:26 crc kubenswrapper[4885]: I1205 20:32:26.983274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb" (OuterVolumeSpecName: "kube-api-access-m9fkb") pod "54bae71b-4af1-49b5-a41b-58e6aafd26ca" (UID: "54bae71b-4af1-49b5-a41b-58e6aafd26ca"). InnerVolumeSpecName "kube-api-access-m9fkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.007005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "54bae71b-4af1-49b5-a41b-58e6aafd26ca" (UID: "54bae71b-4af1-49b5-a41b-58e6aafd26ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.033330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory" (OuterVolumeSpecName: "inventory") pod "54bae71b-4af1-49b5-a41b-58e6aafd26ca" (UID: "54bae71b-4af1-49b5-a41b-58e6aafd26ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.075754 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.075790 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9fkb\" (UniqueName: \"kubernetes.io/projected/54bae71b-4af1-49b5-a41b-58e6aafd26ca-kube-api-access-m9fkb\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.075800 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.075809 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54bae71b-4af1-49b5-a41b-58e6aafd26ca-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.411873 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" event={"ID":"54bae71b-4af1-49b5-a41b-58e6aafd26ca","Type":"ContainerDied","Data":"a7fa112c93bfbf516893c5f2f0a7d96344e8dd908f2a41edd929cf4e9133b406"}
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.411914 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fa112c93bfbf516893c5f2f0a7d96344e8dd908f2a41edd929cf4e9133b406"
Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.411968 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d"
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527484 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9"] Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527838 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="extract-utilities" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527853 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="extract-utilities" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527872 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bae71b-4af1-49b5-a41b-58e6aafd26ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527882 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bae71b-4af1-49b5-a41b-58e6aafd26ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527899 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="registry-server" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527906 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="registry-server" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527916 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="extract-content" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527922 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="extract-content" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527939 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="extract-utilities" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527944 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="extract-utilities" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="registry-server" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527962 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="registry-server" Dec 05 20:32:27 crc kubenswrapper[4885]: E1205 20:32:27.527982 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="extract-content" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.527988 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="extract-content" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.528264 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa1b375-8d14-43c0-a66d-e83f36e8bf12" containerName="registry-server" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.528290 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="012f80db-3d51-4336-94d3-9a54c642d7db" containerName="registry-server" Dec 05 20:32:27 crc 
kubenswrapper[4885]: I1205 20:32:27.528298 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bae71b-4af1-49b5-a41b-58e6aafd26ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.528897 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.531295 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.531833 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.531851 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.531836 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.539578 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9"] Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.685532 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgqzm\" (UniqueName: \"kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.685602 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.685647 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.787478 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgqzm\" (UniqueName: \"kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.787553 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " 
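The burst of RemoveStaleState / "Deleted CPUSet assignment" lines above fires when the next pod is admitted: the CPU and memory managers drop per-container entries, keyed by pod UID and container name, that belong to pods already deleted. An illustrative sketch of that bookkeeping pattern (not the kubelet's actual data structures; names here are made up):

    package main

    import "fmt"

    // assignments maps podUID -> containerName -> an opaque resource
    // reservation (the real managers store CPU sets / memory blocks).
    type assignments map[string]map[string]string

    // removeStaleState drops every entry whose pod is no longer active,
    // as cpu_manager/memory_manager do on pod admission.
    func (a assignments) removeStaleState(active map[string]bool) {
        for podUID, containers := range a {
            if active[podUID] {
                continue
            }
            for name := range containers {
                fmt.Printf("removing stale container %s/%s\n", podUID, name)
            }
            delete(a, podUID)
        }
    }

    func main() {
        a := assignments{
            "faa1b375": {"registry-server": "cpus 0-1"},
            "a16820a2": {"download-cache": "cpus 2-3"},
        }
        a.removeStaleState(map[string]bool{"a16820a2": true})
    }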
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.787602 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.792807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.795732 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.806392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgqzm\" (UniqueName: \"kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:27 crc kubenswrapper[4885]: I1205 20:32:27.851375 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:32:28 crc kubenswrapper[4885]: W1205 20:32:28.389778 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda16820a2_be4e_45d6_bcef_91810571b95f.slice/crio-3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4 WatchSource:0}: Error finding container 3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4: Status 404 returned error can't find the container with id 3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4 Dec 05 20:32:28 crc kubenswrapper[4885]: I1205 20:32:28.390500 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9"] Dec 05 20:32:28 crc kubenswrapper[4885]: I1205 20:32:28.421980 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" event={"ID":"a16820a2-be4e-45d6-bcef-91810571b95f","Type":"ContainerStarted","Data":"3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4"} Dec 05 20:32:29 crc kubenswrapper[4885]: I1205 20:32:29.435597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" event={"ID":"a16820a2-be4e-45d6-bcef-91810571b95f","Type":"ContainerStarted","Data":"1c45435b31495f7c36020ebbc33ce9f8e0941ffc7979103229b16c8b7f823271"} Dec 05 20:32:29 crc kubenswrapper[4885]: I1205 20:32:29.462731 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" podStartSLOduration=2.047863131 podStartE2EDuration="2.462703929s" podCreationTimestamp="2025-12-05 20:32:27 +0000 UTC" firstStartedPulling="2025-12-05 20:32:28.392140162 +0000 UTC m=+1613.688955833" lastFinishedPulling="2025-12-05 20:32:28.80698097 +0000 UTC m=+1614.103796631" observedRunningTime="2025-12-05 20:32:29.453894773 +0000 UTC m=+1614.750710464" watchObservedRunningTime="2025-12-05 20:32:29.462703929 +0000 UTC m=+1614.759519590" Dec 05 20:32:38 crc kubenswrapper[4885]: I1205 20:32:38.172539 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:32:38 crc kubenswrapper[4885]: E1205 20:32:38.173295 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:32:51 crc kubenswrapper[4885]: I1205 20:32:51.175427 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:32:51 crc kubenswrapper[4885]: E1205 20:32:51.176217 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:32:53 crc kubenswrapper[4885]: 
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.261770 4885 scope.go:117] "RemoveContainer" containerID="3b5088b7655762e894c5483c300c45f9247d9c330fa09bc966035c2930b6e0ee"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.284840 4885 scope.go:117] "RemoveContainer" containerID="9a13385ba5146482e1ce3d79a09d837232e20e1d7622901bf8f6d54d0d03433f"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.319439 4885 scope.go:117] "RemoveContainer" containerID="5f0393a95995f9e28baffd6fdf0f1ab0081bd14e7ef33553c73ece93657bbb83"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.343255 4885 scope.go:117] "RemoveContainer" containerID="99426b8a91ce79a2fbf126d13e9637a7ee46de85c53a0fd3b3cb46e6fff324b0"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.363944 4885 scope.go:117] "RemoveContainer" containerID="6d79bb4c47a9cfe5885bb13b5e4b860cf255d87e77d729673b45b16dfe054862"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.385654 4885 scope.go:117] "RemoveContainer" containerID="82bd2550776c48980a4eadd10c0e1df19e101cb4a3ff0d65ea35a405e2624015"
Dec 05 20:32:53 crc kubenswrapper[4885]: I1205 20:32:53.413928 4885 scope.go:117] "RemoveContainer" containerID="64001c8f95977d9d006aa1bdf92e0186a77e6651db187862588d6edb57d88429"
Dec 05 20:33:04 crc kubenswrapper[4885]: I1205 20:33:04.173084 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:33:04 crc kubenswrapper[4885]: E1205 20:33:04.173639 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:33:15 crc kubenswrapper[4885]: I1205 20:33:15.186933 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:33:15 crc kubenswrapper[4885]: E1205 20:33:15.188173 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:33:20 crc kubenswrapper[4885]: I1205 20:33:20.050604 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4e7e-account-create-update-8xkqd"]
Dec 05 20:33:20 crc kubenswrapper[4885]: I1205 20:33:20.064766 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-nlh7l"]
Dec 05 20:33:20 crc kubenswrapper[4885]: I1205 20:33:20.074589 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-nlh7l"]
Dec 05 20:33:20 crc kubenswrapper[4885]: I1205 20:33:20.081824 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4e7e-account-create-update-8xkqd"]
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.041487 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kphfn"]
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.058947 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4a97-account-create-update-58vvg"]
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.094700 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4a97-account-create-update-58vvg"]
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.108151 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kphfn"]
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.191858 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080544c2-141c-49d0-86a9-533fefe28a4f" path="/var/lib/kubelet/pods/080544c2-141c-49d0-86a9-533fefe28a4f/volumes"
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.193287 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4741a673-bd48-498b-bade-5b2dfb1b0cce" path="/var/lib/kubelet/pods/4741a673-bd48-498b-bade-5b2dfb1b0cce/volumes"
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.194820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdb57a5-2227-4495-b30d-e0867eba0435" path="/var/lib/kubelet/pods/4cdb57a5-2227-4495-b30d-e0867eba0435/volumes"
Dec 05 20:33:21 crc kubenswrapper[4885]: I1205 20:33:21.196188 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb59301-abf6-47e6-9f76-86e7908c07f2" path="/var/lib/kubelet/pods/7cb59301-abf6-47e6-9f76-86e7908c07f2/volumes"
Dec 05 20:33:26 crc kubenswrapper[4885]: I1205 20:33:26.048605 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-z548t"]
Dec 05 20:33:26 crc kubenswrapper[4885]: I1205 20:33:26.056459 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9380-account-create-update-dnrjv"]
Dec 05 20:33:26 crc kubenswrapper[4885]: I1205 20:33:26.063369 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9380-account-create-update-dnrjv"]
Dec 05 20:33:26 crc kubenswrapper[4885]: I1205 20:33:26.071451 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-z548t"]
Dec 05 20:33:26 crc kubenswrapper[4885]: I1205 20:33:26.173009 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:33:26 crc kubenswrapper[4885]: E1205 20:33:26.173443 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:33:27 crc kubenswrapper[4885]: I1205 20:33:27.185434 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78491fc8-8cb0-489e-98b5-a3f19812d082" path="/var/lib/kubelet/pods/78491fc8-8cb0-489e-98b5-a3f19812d082/volumes"
Dec 05 20:33:27 crc kubenswrapper[4885]: I1205 20:33:27.186344 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82526f77-6157-4c1e-b14d-72e377c0971b" path="/var/lib/kubelet/pods/82526f77-6157-4c1e-b14d-72e377c0971b/volumes"
Dec 05 20:33:40 crc kubenswrapper[4885]: I1205 20:33:40.174544 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:33:40 crc kubenswrapper[4885]: E1205 20:33:40.175384 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.038790 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f105-account-create-update-d94w4"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.052259 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qg5dj"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.060203 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-486b-account-create-update-8dkrv"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.068775 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qg5dj"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.076215 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-63fc-account-create-update-zvpzs"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.083427 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f105-account-create-update-d94w4"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.091436 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9ph7x"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.100567 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-63fc-account-create-update-zvpzs"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.108567 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-486b-account-create-update-8dkrv"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.116013 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9ph7x"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.123804 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-b7z2f"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.131245 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-b7z2f"]
Dec 05 20:33:52 crc kubenswrapper[4885]: I1205 20:33:52.172439 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba"
Dec 05 20:33:52 crc kubenswrapper[4885]: E1205 20:33:52.172753 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.187181 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077192b6-b7a8-4da8-b840-8486e927178f" path="/var/lib/kubelet/pods/077192b6-b7a8-4da8-b840-8486e927178f/volumes"
Dec 05 20:33:53 crc kubenswrapper[4885]: I1205
20:33:53.188282 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d27b3a6-f5ea-4e96-b5f5-22db1454767c" path="/var/lib/kubelet/pods/0d27b3a6-f5ea-4e96-b5f5-22db1454767c/volumes" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.189042 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a1b3cc-24e0-48ac-af60-1a740a0f6103" path="/var/lib/kubelet/pods/34a1b3cc-24e0-48ac-af60-1a740a0f6103/volumes" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.189789 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d946273-15f3-46e4-a64e-7fb5cbcce090" path="/var/lib/kubelet/pods/3d946273-15f3-46e4-a64e-7fb5cbcce090/volumes" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.191153 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd" path="/var/lib/kubelet/pods/f33cd1e3-5669-4c58-a129-4ef5c4c5e1dd/volumes" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.191917 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcfdb1c5-4c20-42eb-9a6e-e8716d226881" path="/var/lib/kubelet/pods/fcfdb1c5-4c20-42eb-9a6e-e8716d226881/volumes" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.530490 4885 scope.go:117] "RemoveContainer" containerID="90a9df812a347b1c68cb393348dbca8f688fadb9f85e156a8f21a37fa650fc72" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.568958 4885 scope.go:117] "RemoveContainer" containerID="eeba5ae1b91678d4ac3e809b5740ea2a82a2c0f7fd0de4fde8e69ac0969323b3" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.627788 4885 scope.go:117] "RemoveContainer" containerID="d2fd1b45172063e39a70ee6b2dd01a27495fcbf2190a1a169e47906834c8ddc0" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.669712 4885 scope.go:117] "RemoveContainer" containerID="c27607964e29b261a360669f935dc6fe0b288a7336052682463c372d4703ac84" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.714247 4885 scope.go:117] "RemoveContainer" containerID="34d33acc37f610c18801fded2160e52ac8145fb1c29cdcc67c068ddacc35df31" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.758776 4885 scope.go:117] "RemoveContainer" containerID="ac155aa072a92cc10d6f357a15f70afb1d63ad6062ef817174bc0d33e02e88d8" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.807836 4885 scope.go:117] "RemoveContainer" containerID="285344bc6bd10b5d64b925d943af4a7898fbcecec219dc4f6bbc71d172313cb8" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.834129 4885 scope.go:117] "RemoveContainer" containerID="1e88419f74b85dfab715e45192413e6fceb421fed02931ab59ff4298c6cd7220" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.873318 4885 scope.go:117] "RemoveContainer" containerID="ddc8cb6070ddda4869cedd5218287eeae217a7f0ed5a190af26bffc8c30275bb" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.906664 4885 scope.go:117] "RemoveContainer" containerID="5f395e573889f524a1a01c75a51e6089a32fa2e080d8e064d0d7d5d3cf76136c" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.932857 4885 scope.go:117] "RemoveContainer" containerID="08a9618affed1c0c7b45043b426f556e6034ceef394f856fa6e0c76bc9428633" Dec 05 20:33:53 crc kubenswrapper[4885]: I1205 20:33:53.957935 4885 scope.go:117] "RemoveContainer" containerID="717a425951e4fb12b77d67dd1a63826a7c6d54ddbaa73f8c26ec23dffce1231b" Dec 05 20:33:57 crc kubenswrapper[4885]: I1205 20:33:57.043586 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hdlzl"] Dec 05 20:33:57 crc kubenswrapper[4885]: 
I1205 20:33:57.053879 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hdlzl"] Dec 05 20:33:57 crc kubenswrapper[4885]: I1205 20:33:57.186853 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483b86cb-8402-4f2d-8423-7f88ff0cc353" path="/var/lib/kubelet/pods/483b86cb-8402-4f2d-8423-7f88ff0cc353/volumes" Dec 05 20:34:03 crc kubenswrapper[4885]: I1205 20:34:03.173804 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:34:03 crc kubenswrapper[4885]: E1205 20:34:03.175387 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:34:07 crc kubenswrapper[4885]: I1205 20:34:07.455340 4885 generic.go:334] "Generic (PLEG): container finished" podID="a16820a2-be4e-45d6-bcef-91810571b95f" containerID="1c45435b31495f7c36020ebbc33ce9f8e0941ffc7979103229b16c8b7f823271" exitCode=0 Dec 05 20:34:07 crc kubenswrapper[4885]: I1205 20:34:07.455425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" event={"ID":"a16820a2-be4e-45d6-bcef-91810571b95f","Type":"ContainerDied","Data":"1c45435b31495f7c36020ebbc33ce9f8e0941ffc7979103229b16c8b7f823271"} Dec 05 20:34:08 crc kubenswrapper[4885]: I1205 20:34:08.896152 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.093822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory\") pod \"a16820a2-be4e-45d6-bcef-91810571b95f\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.094059 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgqzm\" (UniqueName: \"kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm\") pod \"a16820a2-be4e-45d6-bcef-91810571b95f\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.094093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key\") pod \"a16820a2-be4e-45d6-bcef-91810571b95f\" (UID: \"a16820a2-be4e-45d6-bcef-91810571b95f\") " Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.100284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm" (OuterVolumeSpecName: "kube-api-access-lgqzm") pod "a16820a2-be4e-45d6-bcef-91810571b95f" (UID: "a16820a2-be4e-45d6-bcef-91810571b95f"). InnerVolumeSpecName "kube-api-access-lgqzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.127048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a16820a2-be4e-45d6-bcef-91810571b95f" (UID: "a16820a2-be4e-45d6-bcef-91810571b95f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.147609 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory" (OuterVolumeSpecName: "inventory") pod "a16820a2-be4e-45d6-bcef-91810571b95f" (UID: "a16820a2-be4e-45d6-bcef-91810571b95f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.195751 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.195778 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgqzm\" (UniqueName: \"kubernetes.io/projected/a16820a2-be4e-45d6-bcef-91810571b95f-kube-api-access-lgqzm\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.195789 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a16820a2-be4e-45d6-bcef-91810571b95f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.482059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" event={"ID":"a16820a2-be4e-45d6-bcef-91810571b95f","Type":"ContainerDied","Data":"3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4"} Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.482095 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3151f0a291843f31e1341a7e92a6b06e95c6883dedbfc32c1484eaf173ba25d4" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.482146 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.571698 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26"] Dec 05 20:34:09 crc kubenswrapper[4885]: E1205 20:34:09.572281 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16820a2-be4e-45d6-bcef-91810571b95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.572314 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16820a2-be4e-45d6-bcef-91810571b95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.572638 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16820a2-be4e-45d6-bcef-91810571b95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.573949 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.575806 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.576173 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.576419 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.576976 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.580424 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26"] Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.703521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjnc\" (UniqueName: \"kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.703621 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.704411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.807224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjnc\" (UniqueName: \"kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.807669 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.807744 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.818950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.818993 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.829794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjnc\" (UniqueName: \"kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nvh26\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:09 crc kubenswrapper[4885]: I1205 20:34:09.895595 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:34:10 crc kubenswrapper[4885]: I1205 20:34:10.391235 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:34:10 crc kubenswrapper[4885]: I1205 20:34:10.391959 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26"] Dec 05 20:34:10 crc kubenswrapper[4885]: I1205 20:34:10.490546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" event={"ID":"cf7e7e25-a243-4caf-8b1a-34c1830a097e","Type":"ContainerStarted","Data":"5262086fd15de752abb90542302dfd8a031f29b1c11e5e695ca7d966250f50fe"} Dec 05 20:34:11 crc kubenswrapper[4885]: I1205 20:34:11.501217 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" event={"ID":"cf7e7e25-a243-4caf-8b1a-34c1830a097e","Type":"ContainerStarted","Data":"73ccd334aadab0d570911f45e0673bcf5f6527b349cce293ddb402c846a76bbf"} Dec 05 20:34:11 crc kubenswrapper[4885]: I1205 20:34:11.525365 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" podStartSLOduration=1.961500359 podStartE2EDuration="2.525345772s" podCreationTimestamp="2025-12-05 20:34:09 +0000 UTC" firstStartedPulling="2025-12-05 20:34:10.3910173 +0000 UTC m=+1715.687832961" lastFinishedPulling="2025-12-05 20:34:10.954862703 +0000 UTC m=+1716.251678374" observedRunningTime="2025-12-05 20:34:11.517656282 +0000 UTC m=+1716.814471943" watchObservedRunningTime="2025-12-05 20:34:11.525345772 +0000 UTC m=+1716.822161433" Dec 05 20:34:17 crc kubenswrapper[4885]: I1205 20:34:17.172849 4885 scope.go:117] "RemoveContainer" 
containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:34:17 crc kubenswrapper[4885]: E1205 20:34:17.173749 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:34:28 crc kubenswrapper[4885]: I1205 20:34:28.175366 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:34:28 crc kubenswrapper[4885]: E1205 20:34:28.177545 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:34:31 crc kubenswrapper[4885]: I1205 20:34:31.041668 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v5n6g"] Dec 05 20:34:31 crc kubenswrapper[4885]: I1205 20:34:31.052549 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v5n6g"] Dec 05 20:34:31 crc kubenswrapper[4885]: I1205 20:34:31.190895 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0c93a6-1c5d-49b8-b56b-92460295ec1a" path="/var/lib/kubelet/pods/7c0c93a6-1c5d-49b8-b56b-92460295ec1a/volumes" Dec 05 20:34:42 crc kubenswrapper[4885]: I1205 20:34:42.052580 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cct8h"] Dec 05 20:34:42 crc kubenswrapper[4885]: I1205 20:34:42.071627 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cct8h"] Dec 05 20:34:42 crc kubenswrapper[4885]: I1205 20:34:42.173461 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:34:42 crc kubenswrapper[4885]: E1205 20:34:42.173711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:34:43 crc kubenswrapper[4885]: I1205 20:34:43.043248 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-w6258"] Dec 05 20:34:43 crc kubenswrapper[4885]: I1205 20:34:43.059377 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-w6258"] Dec 05 20:34:43 crc kubenswrapper[4885]: I1205 20:34:43.185523 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be03938-1d91-45a5-beba-a54b318fc799" path="/var/lib/kubelet/pods/9be03938-1d91-45a5-beba-a54b318fc799/volumes" Dec 05 20:34:43 crc kubenswrapper[4885]: I1205 20:34:43.186524 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cd60a9-9ff8-4b35-9069-4e406b9771e1" 
path="/var/lib/kubelet/pods/c9cd60a9-9ff8-4b35-9069-4e406b9771e1/volumes" Dec 05 20:34:50 crc kubenswrapper[4885]: I1205 20:34:50.036355 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dsgxp"] Dec 05 20:34:50 crc kubenswrapper[4885]: I1205 20:34:50.052809 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dsgxp"] Dec 05 20:34:51 crc kubenswrapper[4885]: I1205 20:34:51.185543 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af42085d-f7f5-4dd5-86d1-7019ba4d0888" path="/var/lib/kubelet/pods/af42085d-f7f5-4dd5-86d1-7019ba4d0888/volumes" Dec 05 20:34:52 crc kubenswrapper[4885]: I1205 20:34:52.043136 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6jq57"] Dec 05 20:34:52 crc kubenswrapper[4885]: I1205 20:34:52.051951 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6jq57"] Dec 05 20:34:53 crc kubenswrapper[4885]: I1205 20:34:53.184660 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a908e8-64e1-4fec-b455-66527f7efee3" path="/var/lib/kubelet/pods/e4a908e8-64e1-4fec-b455-66527f7efee3/volumes" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.172882 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:34:54 crc kubenswrapper[4885]: E1205 20:34:54.173158 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.287931 4885 scope.go:117] "RemoveContainer" containerID="733b7abb9b6783fa9892ee608b15a56dccdccc24196a35763797acfd3fe31d85" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.343625 4885 scope.go:117] "RemoveContainer" containerID="b8ff479621e3db136b46b8e45f013a9e4ae7973dde8e3205e8fda0e34ba387b2" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.390009 4885 scope.go:117] "RemoveContainer" containerID="09c866db04b1255facc89243c66caaf1aa66013cccd0c1fd2f09d93f1c4462c8" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.433136 4885 scope.go:117] "RemoveContainer" containerID="7eda14d121200765d3e9ceee44920c17d2cd102f764cced4e39ea638bfb7c831" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.486174 4885 scope.go:117] "RemoveContainer" containerID="79e7b3c2ed82726f00d2118846fb01e953e32a733ac03d1fb6e7186bad673750" Dec 05 20:34:54 crc kubenswrapper[4885]: I1205 20:34:54.531417 4885 scope.go:117] "RemoveContainer" containerID="3d283843e63c03be7d9bef8cdada2311901b50fe69cbc53fa6e187d1a092694b" Dec 05 20:34:58 crc kubenswrapper[4885]: I1205 20:34:58.061726 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5szt6"] Dec 05 20:34:58 crc kubenswrapper[4885]: I1205 20:34:58.079341 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5szt6"] Dec 05 20:34:59 crc kubenswrapper[4885]: I1205 20:34:59.193656 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88521675-6180-4a17-ba7d-6bb9eb07e7dd" path="/var/lib/kubelet/pods/88521675-6180-4a17-ba7d-6bb9eb07e7dd/volumes" Dec 05 20:35:08 crc kubenswrapper[4885]: 
I1205 20:35:08.173761 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:35:08 crc kubenswrapper[4885]: E1205 20:35:08.174981 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:35:21 crc kubenswrapper[4885]: I1205 20:35:21.173200 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:35:21 crc kubenswrapper[4885]: E1205 20:35:21.173962 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:35:24 crc kubenswrapper[4885]: I1205 20:35:24.265952 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7e7e25-a243-4caf-8b1a-34c1830a097e" containerID="73ccd334aadab0d570911f45e0673bcf5f6527b349cce293ddb402c846a76bbf" exitCode=0 Dec 05 20:35:24 crc kubenswrapper[4885]: I1205 20:35:24.266083 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" event={"ID":"cf7e7e25-a243-4caf-8b1a-34c1830a097e","Type":"ContainerDied","Data":"73ccd334aadab0d570911f45e0673bcf5f6527b349cce293ddb402c846a76bbf"} Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.724497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.875987 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key\") pod \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.876376 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory\") pod \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.876428 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjnc\" (UniqueName: \"kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc\") pod \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\" (UID: \"cf7e7e25-a243-4caf-8b1a-34c1830a097e\") " Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.882280 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc" (OuterVolumeSpecName: "kube-api-access-cmjnc") pod "cf7e7e25-a243-4caf-8b1a-34c1830a097e" (UID: "cf7e7e25-a243-4caf-8b1a-34c1830a097e"). 
InnerVolumeSpecName "kube-api-access-cmjnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.920605 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf7e7e25-a243-4caf-8b1a-34c1830a097e" (UID: "cf7e7e25-a243-4caf-8b1a-34c1830a097e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.930602 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory" (OuterVolumeSpecName: "inventory") pod "cf7e7e25-a243-4caf-8b1a-34c1830a097e" (UID: "cf7e7e25-a243-4caf-8b1a-34c1830a097e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.979315 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.979355 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjnc\" (UniqueName: \"kubernetes.io/projected/cf7e7e25-a243-4caf-8b1a-34c1830a097e-kube-api-access-cmjnc\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:25 crc kubenswrapper[4885]: I1205 20:35:25.979366 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf7e7e25-a243-4caf-8b1a-34c1830a097e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.289167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" event={"ID":"cf7e7e25-a243-4caf-8b1a-34c1830a097e","Type":"ContainerDied","Data":"5262086fd15de752abb90542302dfd8a031f29b1c11e5e695ca7d966250f50fe"} Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.289850 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5262086fd15de752abb90542302dfd8a031f29b1c11e5e695ca7d966250f50fe" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.289339 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nvh26" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.380713 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q"] Dec 05 20:35:26 crc kubenswrapper[4885]: E1205 20:35:26.381220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7e7e25-a243-4caf-8b1a-34c1830a097e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.381246 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7e7e25-a243-4caf-8b1a-34c1830a097e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.381493 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7e7e25-a243-4caf-8b1a-34c1830a097e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.382278 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.384103 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.384919 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.385364 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.385608 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.402373 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q"] Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.491688 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.491798 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tdh\" (UniqueName: \"kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.491983 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.593830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.593986 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.594109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tdh\" (UniqueName: \"kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.601658 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.603035 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.615070 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tdh\" (UniqueName: \"kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:26 crc kubenswrapper[4885]: I1205 20:35:26.700067 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:27 crc kubenswrapper[4885]: I1205 20:35:27.254830 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q"] Dec 05 20:35:27 crc kubenswrapper[4885]: I1205 20:35:27.302891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" event={"ID":"f6fcaa99-97aa-46d8-be19-5cac454e2f77","Type":"ContainerStarted","Data":"a0a651406f8afc8209bea5562b854b0d0f3eba7daa46c04ef41d919a2acf34c8"} Dec 05 20:35:28 crc kubenswrapper[4885]: I1205 20:35:28.315176 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" event={"ID":"f6fcaa99-97aa-46d8-be19-5cac454e2f77","Type":"ContainerStarted","Data":"72da5c2dad84dabb8958b687642b8b146bc32efb9e9e9e34deb4849d3375dba9"} Dec 05 20:35:28 crc kubenswrapper[4885]: I1205 20:35:28.339291 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" podStartSLOduration=1.928518402 podStartE2EDuration="2.339270348s" podCreationTimestamp="2025-12-05 20:35:26 +0000 UTC" firstStartedPulling="2025-12-05 20:35:27.261079689 +0000 UTC m=+1792.557895390" lastFinishedPulling="2025-12-05 20:35:27.671831675 +0000 UTC m=+1792.968647336" observedRunningTime="2025-12-05 20:35:28.335676566 +0000 UTC m=+1793.632492277" watchObservedRunningTime="2025-12-05 20:35:28.339270348 +0000 UTC m=+1793.636086019" Dec 05 20:35:33 crc kubenswrapper[4885]: I1205 20:35:33.173563 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:35:33 crc kubenswrapper[4885]: E1205 20:35:33.174650 4885 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:35:33 crc kubenswrapper[4885]: I1205 20:35:33.369153 4885 generic.go:334] "Generic (PLEG): container finished" podID="f6fcaa99-97aa-46d8-be19-5cac454e2f77" containerID="72da5c2dad84dabb8958b687642b8b146bc32efb9e9e9e34deb4849d3375dba9" exitCode=0 Dec 05 20:35:33 crc kubenswrapper[4885]: I1205 20:35:33.369231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" event={"ID":"f6fcaa99-97aa-46d8-be19-5cac454e2f77","Type":"ContainerDied","Data":"72da5c2dad84dabb8958b687642b8b146bc32efb9e9e9e34deb4849d3375dba9"} Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.848074 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.909535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key\") pod \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.909664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9tdh\" (UniqueName: \"kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh\") pod \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.909915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory\") pod \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\" (UID: \"f6fcaa99-97aa-46d8-be19-5cac454e2f77\") " Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.915066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh" (OuterVolumeSpecName: "kube-api-access-l9tdh") pod "f6fcaa99-97aa-46d8-be19-5cac454e2f77" (UID: "f6fcaa99-97aa-46d8-be19-5cac454e2f77"). InnerVolumeSpecName "kube-api-access-l9tdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.936399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory" (OuterVolumeSpecName: "inventory") pod "f6fcaa99-97aa-46d8-be19-5cac454e2f77" (UID: "f6fcaa99-97aa-46d8-be19-5cac454e2f77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:34 crc kubenswrapper[4885]: I1205 20:35:34.948174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6fcaa99-97aa-46d8-be19-5cac454e2f77" (UID: "f6fcaa99-97aa-46d8-be19-5cac454e2f77"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.014099 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.014128 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fcaa99-97aa-46d8-be19-5cac454e2f77-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.014139 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9tdh\" (UniqueName: \"kubernetes.io/projected/f6fcaa99-97aa-46d8-be19-5cac454e2f77-kube-api-access-l9tdh\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.399144 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" event={"ID":"f6fcaa99-97aa-46d8-be19-5cac454e2f77","Type":"ContainerDied","Data":"a0a651406f8afc8209bea5562b854b0d0f3eba7daa46c04ef41d919a2acf34c8"} Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.399195 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.399200 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a651406f8afc8209bea5562b854b0d0f3eba7daa46c04ef41d919a2acf34c8" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.492093 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z"] Dec 05 20:35:35 crc kubenswrapper[4885]: E1205 20:35:35.492780 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fcaa99-97aa-46d8-be19-5cac454e2f77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.492823 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fcaa99-97aa-46d8-be19-5cac454e2f77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.493211 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fcaa99-97aa-46d8-be19-5cac454e2f77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.494327 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.497445 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.497591 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.497511 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.501940 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.503376 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z"] Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.566075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwvd\" (UniqueName: \"kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.566290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.566363 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.668388 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwvd\" (UniqueName: \"kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.668870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.668920 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: 
\"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.674464 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.674514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.695244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwvd\" (UniqueName: \"kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d6c5z\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:35 crc kubenswrapper[4885]: I1205 20:35:35.826682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:35:36 crc kubenswrapper[4885]: I1205 20:35:36.371816 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z"] Dec 05 20:35:36 crc kubenswrapper[4885]: I1205 20:35:36.408575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" event={"ID":"d0a9ab2d-1012-41ba-b810-c7f7f127330e","Type":"ContainerStarted","Data":"348ad8537ba2d1dfd9c55fcee4cf06c45d44a95e746b7b50951c23acd4d1bd4a"} Dec 05 20:35:37 crc kubenswrapper[4885]: I1205 20:35:37.425248 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" event={"ID":"d0a9ab2d-1012-41ba-b810-c7f7f127330e","Type":"ContainerStarted","Data":"6acf817a78343f50b2bee1f2480b934490a4ad65a9c5c8bd28d9852aa56d8c0d"} Dec 05 20:35:37 crc kubenswrapper[4885]: I1205 20:35:37.445998 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" podStartSLOduration=1.997522808 podStartE2EDuration="2.445975549s" podCreationTimestamp="2025-12-05 20:35:35 +0000 UTC" firstStartedPulling="2025-12-05 20:35:36.377778761 +0000 UTC m=+1801.674594422" lastFinishedPulling="2025-12-05 20:35:36.826231452 +0000 UTC m=+1802.123047163" observedRunningTime="2025-12-05 20:35:37.439779946 +0000 UTC m=+1802.736595607" watchObservedRunningTime="2025-12-05 20:35:37.445975549 +0000 UTC m=+1802.742791210" Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.056529 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-cn68w"] Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.070239 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e423-account-create-update-4n6sv"] Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.080865 4885 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-db-create-4tgwl"] Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.091077 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-cn68w"] Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.100857 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e423-account-create-update-4n6sv"] Dec 05 20:35:38 crc kubenswrapper[4885]: I1205 20:35:38.109967 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4tgwl"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.031754 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k4lkf"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.067098 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d424-account-create-update-jc98w"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.081096 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k4lkf"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.090395 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7478-account-create-update-7whgg"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.100137 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7478-account-create-update-7whgg"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.107946 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d424-account-create-update-jc98w"] Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.194461 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067e647c-7401-4dd7-9245-94d1675f1bb6" path="/var/lib/kubelet/pods/067e647c-7401-4dd7-9245-94d1675f1bb6/volumes" Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.195286 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e7ef23d-578c-43d5-b7eb-a15cefb90d03" path="/var/lib/kubelet/pods/4e7ef23d-578c-43d5-b7eb-a15cefb90d03/volumes" Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.195978 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607fd1c0-165f-465f-bdd4-134ab3451a51" path="/var/lib/kubelet/pods/607fd1c0-165f-465f-bdd4-134ab3451a51/volumes" Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.196567 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f88ff47-91a3-4fb9-9526-cc39661cbeec" path="/var/lib/kubelet/pods/8f88ff47-91a3-4fb9-9526-cc39661cbeec/volumes" Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.197633 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fa7e5b-9ed9-44de-bd54-105f4608ddb6" path="/var/lib/kubelet/pods/91fa7e5b-9ed9-44de-bd54-105f4608ddb6/volumes" Dec 05 20:35:39 crc kubenswrapper[4885]: I1205 20:35:39.198167 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6ed529-d71f-4427-b906-ec6d3e9c33f0" path="/var/lib/kubelet/pods/ed6ed529-d71f-4427-b906-ec6d3e9c33f0/volumes" Dec 05 20:35:47 crc kubenswrapper[4885]: I1205 20:35:47.172155 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:35:47 crc kubenswrapper[4885]: E1205 20:35:47.173089 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.687004 4885 scope.go:117] "RemoveContainer" containerID="b657377210aa955017cbaa63c4b6f5cbdc53d16343057b6205a4890d542736e2" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.715117 4885 scope.go:117] "RemoveContainer" containerID="9f22a37060b3f6581bd4a3301e5abe5fd3875cbe8d16330efa711db01a8cf445" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.769297 4885 scope.go:117] "RemoveContainer" containerID="23832c1b8f6362618489f30ab3c7de95c873488a4ea5dddad673869c92c3c15e" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.803440 4885 scope.go:117] "RemoveContainer" containerID="159eb634e9bcc31b97dbcbf6020bb37f641eecf16ffcfce2f60ff4da5650b8c1" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.846329 4885 scope.go:117] "RemoveContainer" containerID="8330ce052621a2e1ea010ffed9518941ef80d224afab2b140f20682c4b33b5a5" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.883668 4885 scope.go:117] "RemoveContainer" containerID="d4ac867e8697362f04a9e106845ef431056cb0d2bf0cf8a5183b7a9c25b08545" Dec 05 20:35:54 crc kubenswrapper[4885]: I1205 20:35:54.931151 4885 scope.go:117] "RemoveContainer" containerID="499638b80cfea95c0f85dd0b05050fd6e7a749f7329e84bb4c6d967622a75e2b" Dec 05 20:35:58 crc kubenswrapper[4885]: I1205 20:35:58.173871 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:35:58 crc kubenswrapper[4885]: E1205 20:35:58.174463 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:36:04 crc kubenswrapper[4885]: I1205 20:36:04.062171 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rwl9m"] Dec 05 20:36:04 crc kubenswrapper[4885]: I1205 20:36:04.073302 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rwl9m"] Dec 05 20:36:05 crc kubenswrapper[4885]: I1205 20:36:05.194753 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e595c7-7f03-4290-94df-5a3177b31c16" path="/var/lib/kubelet/pods/28e595c7-7f03-4290-94df-5a3177b31c16/volumes" Dec 05 20:36:12 crc kubenswrapper[4885]: I1205 20:36:12.172549 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:36:12 crc kubenswrapper[4885]: E1205 20:36:12.173596 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:36:17 crc kubenswrapper[4885]: I1205 20:36:17.860785 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="d0a9ab2d-1012-41ba-b810-c7f7f127330e" containerID="6acf817a78343f50b2bee1f2480b934490a4ad65a9c5c8bd28d9852aa56d8c0d" exitCode=0 Dec 05 20:36:17 crc kubenswrapper[4885]: I1205 20:36:17.860952 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" event={"ID":"d0a9ab2d-1012-41ba-b810-c7f7f127330e","Type":"ContainerDied","Data":"6acf817a78343f50b2bee1f2480b934490a4ad65a9c5c8bd28d9852aa56d8c0d"} Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.328741 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.482145 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhwvd\" (UniqueName: \"kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd\") pod \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.482209 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory\") pod \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.482442 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key\") pod \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\" (UID: \"d0a9ab2d-1012-41ba-b810-c7f7f127330e\") " Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.488157 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd" (OuterVolumeSpecName: "kube-api-access-rhwvd") pod "d0a9ab2d-1012-41ba-b810-c7f7f127330e" (UID: "d0a9ab2d-1012-41ba-b810-c7f7f127330e"). InnerVolumeSpecName "kube-api-access-rhwvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.511237 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory" (OuterVolumeSpecName: "inventory") pod "d0a9ab2d-1012-41ba-b810-c7f7f127330e" (UID: "d0a9ab2d-1012-41ba-b810-c7f7f127330e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.517171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0a9ab2d-1012-41ba-b810-c7f7f127330e" (UID: "d0a9ab2d-1012-41ba-b810-c7f7f127330e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.585852 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhwvd\" (UniqueName: \"kubernetes.io/projected/d0a9ab2d-1012-41ba-b810-c7f7f127330e-kube-api-access-rhwvd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.585893 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.585905 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0a9ab2d-1012-41ba-b810-c7f7f127330e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.885951 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" event={"ID":"d0a9ab2d-1012-41ba-b810-c7f7f127330e","Type":"ContainerDied","Data":"348ad8537ba2d1dfd9c55fcee4cf06c45d44a95e746b7b50951c23acd4d1bd4a"} Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.885991 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348ad8537ba2d1dfd9c55fcee4cf06c45d44a95e746b7b50951c23acd4d1bd4a" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.886116 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d6c5z" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.992454 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp"] Dec 05 20:36:19 crc kubenswrapper[4885]: E1205 20:36:19.992989 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a9ab2d-1012-41ba-b810-c7f7f127330e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.993010 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a9ab2d-1012-41ba-b810-c7f7f127330e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.993213 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a9ab2d-1012-41ba-b810-c7f7f127330e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.993853 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.998540 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.998620 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.998544 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:36:19 crc kubenswrapper[4885]: I1205 20:36:19.999292 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.001653 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp"] Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.095678 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz4gh\" (UniqueName: \"kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.095819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.095866 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.197741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz4gh\" (UniqueName: \"kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.198044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.198074 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" 
(UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.202679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.210483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.213555 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz4gh\" (UniqueName: \"kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-klnxp\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.347319 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.883121 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp"] Dec 05 20:36:20 crc kubenswrapper[4885]: W1205 20:36:20.883858 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9487fa66_920b_41fc_beb6_4dffcb4a898a.slice/crio-3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97 WatchSource:0}: Error finding container 3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97: Status 404 returned error can't find the container with id 3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97 Dec 05 20:36:20 crc kubenswrapper[4885]: I1205 20:36:20.896522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" event={"ID":"9487fa66-920b-41fc-beb6-4dffcb4a898a","Type":"ContainerStarted","Data":"3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97"} Dec 05 20:36:21 crc kubenswrapper[4885]: I1205 20:36:21.911270 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" event={"ID":"9487fa66-920b-41fc-beb6-4dffcb4a898a","Type":"ContainerStarted","Data":"48455e1983e9c8b0f6d42855a0eec74ff4278c0e9364b5570a27f7c5081f0edd"} Dec 05 20:36:21 crc kubenswrapper[4885]: I1205 20:36:21.947227 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" podStartSLOduration=2.557549856 podStartE2EDuration="2.947206705s" podCreationTimestamp="2025-12-05 20:36:19 +0000 UTC" firstStartedPulling="2025-12-05 20:36:20.886147539 +0000 UTC m=+1846.182963190" lastFinishedPulling="2025-12-05 20:36:21.275804378 +0000 UTC m=+1846.572620039" observedRunningTime="2025-12-05 
20:36:21.931558456 +0000 UTC m=+1847.228374157" watchObservedRunningTime="2025-12-05 20:36:21.947206705 +0000 UTC m=+1847.244022376" Dec 05 20:36:25 crc kubenswrapper[4885]: I1205 20:36:25.185319 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:36:25 crc kubenswrapper[4885]: I1205 20:36:25.955384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a"} Dec 05 20:36:26 crc kubenswrapper[4885]: I1205 20:36:26.064353 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9qst6"] Dec 05 20:36:26 crc kubenswrapper[4885]: I1205 20:36:26.079634 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bv27r"] Dec 05 20:36:26 crc kubenswrapper[4885]: I1205 20:36:26.088646 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9qst6"] Dec 05 20:36:26 crc kubenswrapper[4885]: I1205 20:36:26.096537 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bv27r"] Dec 05 20:36:27 crc kubenswrapper[4885]: I1205 20:36:27.183464 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8c8f0f-88d1-4be1-8db0-882fac969fce" path="/var/lib/kubelet/pods/ad8c8f0f-88d1-4be1-8db0-882fac969fce/volumes" Dec 05 20:36:27 crc kubenswrapper[4885]: I1205 20:36:27.184554 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31833f3-c584-4352-bf8c-03e18def1ea2" path="/var/lib/kubelet/pods/e31833f3-c584-4352-bf8c-03e18def1ea2/volumes" Dec 05 20:36:55 crc kubenswrapper[4885]: I1205 20:36:55.063319 4885 scope.go:117] "RemoveContainer" containerID="d6082b550762079a1f6e3eff5f27e3cb12fea5804fab76fa63d6d0513b45530d" Dec 05 20:36:55 crc kubenswrapper[4885]: I1205 20:36:55.144214 4885 scope.go:117] "RemoveContainer" containerID="de1fc11cb66d09131a6dc49a451f0dbe994f12c1e351512c7a70089e2b3da346" Dec 05 20:36:55 crc kubenswrapper[4885]: I1205 20:36:55.214929 4885 scope.go:117] "RemoveContainer" containerID="aed89c1191b75d1079c9c31b64b659a5796e3a63edfdbe2b5e6f4b582c388024" Dec 05 20:37:11 crc kubenswrapper[4885]: I1205 20:37:11.050014 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-lg8pc"] Dec 05 20:37:11 crc kubenswrapper[4885]: I1205 20:37:11.059044 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-lg8pc"] Dec 05 20:37:11 crc kubenswrapper[4885]: I1205 20:37:11.184392 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7f1297-73c8-4b59-99c9-386d4b5483a1" path="/var/lib/kubelet/pods/7a7f1297-73c8-4b59-99c9-386d4b5483a1/volumes" Dec 05 20:37:17 crc kubenswrapper[4885]: I1205 20:37:17.525926 4885 generic.go:334] "Generic (PLEG): container finished" podID="9487fa66-920b-41fc-beb6-4dffcb4a898a" containerID="48455e1983e9c8b0f6d42855a0eec74ff4278c0e9364b5570a27f7c5081f0edd" exitCode=0 Dec 05 20:37:17 crc kubenswrapper[4885]: I1205 20:37:17.526041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" event={"ID":"9487fa66-920b-41fc-beb6-4dffcb4a898a","Type":"ContainerDied","Data":"48455e1983e9c8b0f6d42855a0eec74ff4278c0e9364b5570a27f7c5081f0edd"} Dec 05 20:37:18 crc 
kubenswrapper[4885]: I1205 20:37:18.957697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.125674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key\") pod \"9487fa66-920b-41fc-beb6-4dffcb4a898a\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.125885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory\") pod \"9487fa66-920b-41fc-beb6-4dffcb4a898a\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.125915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz4gh\" (UniqueName: \"kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh\") pod \"9487fa66-920b-41fc-beb6-4dffcb4a898a\" (UID: \"9487fa66-920b-41fc-beb6-4dffcb4a898a\") " Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.138370 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh" (OuterVolumeSpecName: "kube-api-access-jz4gh") pod "9487fa66-920b-41fc-beb6-4dffcb4a898a" (UID: "9487fa66-920b-41fc-beb6-4dffcb4a898a"). InnerVolumeSpecName "kube-api-access-jz4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.182792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9487fa66-920b-41fc-beb6-4dffcb4a898a" (UID: "9487fa66-920b-41fc-beb6-4dffcb4a898a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.197346 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory" (OuterVolumeSpecName: "inventory") pod "9487fa66-920b-41fc-beb6-4dffcb4a898a" (UID: "9487fa66-920b-41fc-beb6-4dffcb4a898a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.227993 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.228043 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz4gh\" (UniqueName: \"kubernetes.io/projected/9487fa66-920b-41fc-beb6-4dffcb4a898a-kube-api-access-jz4gh\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.228176 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9487fa66-920b-41fc-beb6-4dffcb4a898a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.549844 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" event={"ID":"9487fa66-920b-41fc-beb6-4dffcb4a898a","Type":"ContainerDied","Data":"3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97"} Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.549892 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3947e60d8f3da77905aefa8fc4ee2d512e6a0636d1019b74298c0a2e0a73cf97" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.549933 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-klnxp" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.656481 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jdng"] Dec 05 20:37:19 crc kubenswrapper[4885]: E1205 20:37:19.657145 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9487fa66-920b-41fc-beb6-4dffcb4a898a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.657170 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9487fa66-920b-41fc-beb6-4dffcb4a898a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.657501 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9487fa66-920b-41fc-beb6-4dffcb4a898a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.658453 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.662975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.663216 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.664085 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.670724 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.671304 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jdng"] Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.739313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.739388 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.739436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwtw\" (UniqueName: \"kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.840899 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.841196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.841295 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwtw\" (UniqueName: \"kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc 
kubenswrapper[4885]: I1205 20:37:19.846501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.847483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.861559 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwtw\" (UniqueName: \"kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw\") pod \"ssh-known-hosts-edpm-deployment-9jdng\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:19 crc kubenswrapper[4885]: I1205 20:37:19.982733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:20 crc kubenswrapper[4885]: I1205 20:37:20.511532 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jdng"] Dec 05 20:37:20 crc kubenswrapper[4885]: I1205 20:37:20.559439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" event={"ID":"8f98cb4a-349f-443b-aab3-686a3d0bcc67","Type":"ContainerStarted","Data":"45479c43a179ea48b1af9733106502c5577a690415e48ea4b75819c3fc9ff3aa"} Dec 05 20:37:21 crc kubenswrapper[4885]: I1205 20:37:21.583280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" event={"ID":"8f98cb4a-349f-443b-aab3-686a3d0bcc67","Type":"ContainerStarted","Data":"a9c51d38c71a610d7d28578b6379bf20072aa09a504f18542c543dc1521d1f9b"} Dec 05 20:37:21 crc kubenswrapper[4885]: I1205 20:37:21.603910 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" podStartSLOduration=2.178884787 podStartE2EDuration="2.60388845s" podCreationTimestamp="2025-12-05 20:37:19 +0000 UTC" firstStartedPulling="2025-12-05 20:37:20.517043998 +0000 UTC m=+1905.813859669" lastFinishedPulling="2025-12-05 20:37:20.942047661 +0000 UTC m=+1906.238863332" observedRunningTime="2025-12-05 20:37:21.60232727 +0000 UTC m=+1906.899142931" watchObservedRunningTime="2025-12-05 20:37:21.60388845 +0000 UTC m=+1906.900704111" Dec 05 20:37:21 crc kubenswrapper[4885]: I1205 20:37:21.886074 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:21 crc kubenswrapper[4885]: I1205 20:37:21.887746 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:21 crc kubenswrapper[4885]: I1205 20:37:21.895765 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.085130 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbgl\" (UniqueName: \"kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.085695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.085836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.187072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phbgl\" (UniqueName: \"kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.187565 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.187815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.188222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.188341 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.211149 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-phbgl\" (UniqueName: \"kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl\") pod \"redhat-marketplace-2hdk7\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.270172 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:22 crc kubenswrapper[4885]: I1205 20:37:22.823105 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:22 crc kubenswrapper[4885]: W1205 20:37:22.829332 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a55ee5_f76d_421c_acb1_034c9dbf2d05.slice/crio-36f42dafcc642e5ffd1a7e50324b3629abde3abcab181ab77dfeb0762cbb2167 WatchSource:0}: Error finding container 36f42dafcc642e5ffd1a7e50324b3629abde3abcab181ab77dfeb0762cbb2167: Status 404 returned error can't find the container with id 36f42dafcc642e5ffd1a7e50324b3629abde3abcab181ab77dfeb0762cbb2167 Dec 05 20:37:23 crc kubenswrapper[4885]: I1205 20:37:23.611937 4885 generic.go:334] "Generic (PLEG): container finished" podID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerID="3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2" exitCode=0 Dec 05 20:37:23 crc kubenswrapper[4885]: I1205 20:37:23.612258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerDied","Data":"3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2"} Dec 05 20:37:23 crc kubenswrapper[4885]: I1205 20:37:23.612279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerStarted","Data":"36f42dafcc642e5ffd1a7e50324b3629abde3abcab181ab77dfeb0762cbb2167"} Dec 05 20:37:24 crc kubenswrapper[4885]: I1205 20:37:24.627092 4885 generic.go:334] "Generic (PLEG): container finished" podID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerID="52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046" exitCode=0 Dec 05 20:37:24 crc kubenswrapper[4885]: I1205 20:37:24.627142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerDied","Data":"52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046"} Dec 05 20:37:25 crc kubenswrapper[4885]: I1205 20:37:25.636494 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerStarted","Data":"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b"} Dec 05 20:37:25 crc kubenswrapper[4885]: I1205 20:37:25.660108 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2hdk7" podStartSLOduration=3.185412869 podStartE2EDuration="4.660085112s" podCreationTimestamp="2025-12-05 20:37:21 +0000 UTC" firstStartedPulling="2025-12-05 20:37:23.613692394 +0000 UTC m=+1908.910508055" lastFinishedPulling="2025-12-05 20:37:25.088364637 +0000 UTC m=+1910.385180298" observedRunningTime="2025-12-05 20:37:25.657444599 +0000 UTC m=+1910.954260260" 
watchObservedRunningTime="2025-12-05 20:37:25.660085112 +0000 UTC m=+1910.956900783" Dec 05 20:37:28 crc kubenswrapper[4885]: I1205 20:37:28.662362 4885 generic.go:334] "Generic (PLEG): container finished" podID="8f98cb4a-349f-443b-aab3-686a3d0bcc67" containerID="a9c51d38c71a610d7d28578b6379bf20072aa09a504f18542c543dc1521d1f9b" exitCode=0 Dec 05 20:37:28 crc kubenswrapper[4885]: I1205 20:37:28.662449 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" event={"ID":"8f98cb4a-349f-443b-aab3-686a3d0bcc67","Type":"ContainerDied","Data":"a9c51d38c71a610d7d28578b6379bf20072aa09a504f18542c543dc1521d1f9b"} Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.111294 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.178817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam\") pod \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.178931 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0\") pod \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.178966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgwtw\" (UniqueName: \"kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw\") pod \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\" (UID: \"8f98cb4a-349f-443b-aab3-686a3d0bcc67\") " Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.183579 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw" (OuterVolumeSpecName: "kube-api-access-hgwtw") pod "8f98cb4a-349f-443b-aab3-686a3d0bcc67" (UID: "8f98cb4a-349f-443b-aab3-686a3d0bcc67"). InnerVolumeSpecName "kube-api-access-hgwtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.205218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8f98cb4a-349f-443b-aab3-686a3d0bcc67" (UID: "8f98cb4a-349f-443b-aab3-686a3d0bcc67"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.209093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8f98cb4a-349f-443b-aab3-686a3d0bcc67" (UID: "8f98cb4a-349f-443b-aab3-686a3d0bcc67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.281652 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.281706 4885 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8f98cb4a-349f-443b-aab3-686a3d0bcc67-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.281725 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgwtw\" (UniqueName: \"kubernetes.io/projected/8f98cb4a-349f-443b-aab3-686a3d0bcc67-kube-api-access-hgwtw\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.693463 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" event={"ID":"8f98cb4a-349f-443b-aab3-686a3d0bcc67","Type":"ContainerDied","Data":"45479c43a179ea48b1af9733106502c5577a690415e48ea4b75819c3fc9ff3aa"} Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.693521 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jdng" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.693543 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45479c43a179ea48b1af9733106502c5577a690415e48ea4b75819c3fc9ff3aa" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.790464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8"] Dec 05 20:37:30 crc kubenswrapper[4885]: E1205 20:37:30.791324 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f98cb4a-349f-443b-aab3-686a3d0bcc67" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.791361 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f98cb4a-349f-443b-aab3-686a3d0bcc67" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.791803 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f98cb4a-349f-443b-aab3-686a3d0bcc67" containerName="ssh-known-hosts-edpm-deployment" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.792835 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.795242 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.795472 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.795940 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.807740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8"] Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.824477 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.892967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4g4\" (UniqueName: \"kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.893081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.893139 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.995562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.995744 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:30 crc kubenswrapper[4885]: I1205 20:37:30.996049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4g4\" (UniqueName: \"kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:31 crc kubenswrapper[4885]: I1205 20:37:31.001807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:31 crc kubenswrapper[4885]: I1205 20:37:31.010747 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:31 crc kubenswrapper[4885]: I1205 20:37:31.015407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4g4\" (UniqueName: \"kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ptcp8\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:31 crc kubenswrapper[4885]: I1205 20:37:31.157236 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:31 crc kubenswrapper[4885]: I1205 20:37:31.738115 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8"] Dec 05 20:37:32 crc kubenswrapper[4885]: I1205 20:37:32.271052 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:32 crc kubenswrapper[4885]: I1205 20:37:32.271175 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:32 crc kubenswrapper[4885]: I1205 20:37:32.386636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:32 crc kubenswrapper[4885]: I1205 20:37:32.714178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" event={"ID":"59678b29-6ffe-4d18-a8bb-8bf4717f9b10","Type":"ContainerStarted","Data":"5371bddcc1f4566e635cc802e8826f7cd3c0be8852f67fb413fef7ca648c719e"} Dec 05 20:37:32 crc kubenswrapper[4885]: I1205 20:37:32.797530 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:33 crc kubenswrapper[4885]: I1205 20:37:33.728415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" event={"ID":"59678b29-6ffe-4d18-a8bb-8bf4717f9b10","Type":"ContainerStarted","Data":"7be9f1550ee3ffa19dfd3fc3ea48df67d05fc7ade2006a1fb3ad09802a7d1eef"} Dec 05 20:37:33 crc kubenswrapper[4885]: I1205 20:37:33.749726 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" podStartSLOduration=2.437116843 podStartE2EDuration="3.749704854s" podCreationTimestamp="2025-12-05 20:37:30 +0000 UTC" firstStartedPulling="2025-12-05 20:37:31.750469709 +0000 UTC m=+1917.047285370" 
lastFinishedPulling="2025-12-05 20:37:33.0630577 +0000 UTC m=+1918.359873381" observedRunningTime="2025-12-05 20:37:33.748921579 +0000 UTC m=+1919.045737260" watchObservedRunningTime="2025-12-05 20:37:33.749704854 +0000 UTC m=+1919.046520515" Dec 05 20:37:35 crc kubenswrapper[4885]: I1205 20:37:35.860052 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:35 crc kubenswrapper[4885]: I1205 20:37:35.860990 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2hdk7" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="registry-server" containerID="cri-o://9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b" gracePeriod=2 Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.454135 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.569792 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities\") pod \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.570171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content\") pod \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.570333 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phbgl\" (UniqueName: \"kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl\") pod \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\" (UID: \"57a55ee5-f76d-421c-acb1-034c9dbf2d05\") " Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.571610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities" (OuterVolumeSpecName: "utilities") pod "57a55ee5-f76d-421c-acb1-034c9dbf2d05" (UID: "57a55ee5-f76d-421c-acb1-034c9dbf2d05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.580398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl" (OuterVolumeSpecName: "kube-api-access-phbgl") pod "57a55ee5-f76d-421c-acb1-034c9dbf2d05" (UID: "57a55ee5-f76d-421c-acb1-034c9dbf2d05"). InnerVolumeSpecName "kube-api-access-phbgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.592097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a55ee5-f76d-421c-acb1-034c9dbf2d05" (UID: "57a55ee5-f76d-421c-acb1-034c9dbf2d05"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.672845 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.673246 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a55ee5-f76d-421c-acb1-034c9dbf2d05-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.673426 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phbgl\" (UniqueName: \"kubernetes.io/projected/57a55ee5-f76d-421c-acb1-034c9dbf2d05-kube-api-access-phbgl\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.757665 4885 generic.go:334] "Generic (PLEG): container finished" podID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerID="9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b" exitCode=0 Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.757717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerDied","Data":"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b"} Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.757744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hdk7" event={"ID":"57a55ee5-f76d-421c-acb1-034c9dbf2d05","Type":"ContainerDied","Data":"36f42dafcc642e5ffd1a7e50324b3629abde3abcab181ab77dfeb0762cbb2167"} Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.757751 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hdk7" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.757760 4885 scope.go:117] "RemoveContainer" containerID="9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.795091 4885 scope.go:117] "RemoveContainer" containerID="52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.799057 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.812485 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hdk7"] Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.824354 4885 scope.go:117] "RemoveContainer" containerID="3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.874881 4885 scope.go:117] "RemoveContainer" containerID="9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b" Dec 05 20:37:36 crc kubenswrapper[4885]: E1205 20:37:36.876211 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b\": container with ID starting with 9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b not found: ID does not exist" containerID="9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.876243 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b"} err="failed to get container status \"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b\": rpc error: code = NotFound desc = could not find container \"9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b\": container with ID starting with 9d55015b0320f7cc217a89735bbc562c96e7eba9cc3e9ec9676903bb87e8ad1b not found: ID does not exist" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.876263 4885 scope.go:117] "RemoveContainer" containerID="52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046" Dec 05 20:37:36 crc kubenswrapper[4885]: E1205 20:37:36.876543 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046\": container with ID starting with 52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046 not found: ID does not exist" containerID="52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.876566 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046"} err="failed to get container status \"52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046\": rpc error: code = NotFound desc = could not find container \"52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046\": container with ID starting with 52a445648bf7bc0fed832b15f3aae1d2614f9529378c87db93b44c1d837ba046 not found: ID does not exist" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.876578 4885 scope.go:117] "RemoveContainer" 
containerID="3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2" Dec 05 20:37:36 crc kubenswrapper[4885]: E1205 20:37:36.876810 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2\": container with ID starting with 3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2 not found: ID does not exist" containerID="3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2" Dec 05 20:37:36 crc kubenswrapper[4885]: I1205 20:37:36.876831 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2"} err="failed to get container status \"3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2\": rpc error: code = NotFound desc = could not find container \"3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2\": container with ID starting with 3bb54bdd176e982196eaa7374720d4bb8b07ac4c617281eca26150491000cae2 not found: ID does not exist" Dec 05 20:37:37 crc kubenswrapper[4885]: I1205 20:37:37.185534 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" path="/var/lib/kubelet/pods/57a55ee5-f76d-421c-acb1-034c9dbf2d05/volumes" Dec 05 20:37:42 crc kubenswrapper[4885]: I1205 20:37:42.824323 4885 generic.go:334] "Generic (PLEG): container finished" podID="59678b29-6ffe-4d18-a8bb-8bf4717f9b10" containerID="7be9f1550ee3ffa19dfd3fc3ea48df67d05fc7ade2006a1fb3ad09802a7d1eef" exitCode=0 Dec 05 20:37:42 crc kubenswrapper[4885]: I1205 20:37:42.824447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" event={"ID":"59678b29-6ffe-4d18-a8bb-8bf4717f9b10","Type":"ContainerDied","Data":"7be9f1550ee3ffa19dfd3fc3ea48df67d05fc7ade2006a1fb3ad09802a7d1eef"} Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.297671 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.421996 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory\") pod \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.422435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key\") pod \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.422503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt4g4\" (UniqueName: \"kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4\") pod \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\" (UID: \"59678b29-6ffe-4d18-a8bb-8bf4717f9b10\") " Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.427253 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4" (OuterVolumeSpecName: "kube-api-access-kt4g4") pod "59678b29-6ffe-4d18-a8bb-8bf4717f9b10" (UID: "59678b29-6ffe-4d18-a8bb-8bf4717f9b10"). InnerVolumeSpecName "kube-api-access-kt4g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.462200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory" (OuterVolumeSpecName: "inventory") pod "59678b29-6ffe-4d18-a8bb-8bf4717f9b10" (UID: "59678b29-6ffe-4d18-a8bb-8bf4717f9b10"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.481395 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59678b29-6ffe-4d18-a8bb-8bf4717f9b10" (UID: "59678b29-6ffe-4d18-a8bb-8bf4717f9b10"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.525728 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.525792 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.525814 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt4g4\" (UniqueName: \"kubernetes.io/projected/59678b29-6ffe-4d18-a8bb-8bf4717f9b10-kube-api-access-kt4g4\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.855183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" event={"ID":"59678b29-6ffe-4d18-a8bb-8bf4717f9b10","Type":"ContainerDied","Data":"5371bddcc1f4566e635cc802e8826f7cd3c0be8852f67fb413fef7ca648c719e"} Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.855231 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5371bddcc1f4566e635cc802e8826f7cd3c0be8852f67fb413fef7ca648c719e" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.855275 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ptcp8" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948182 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk"] Dec 05 20:37:44 crc kubenswrapper[4885]: E1205 20:37:44.948625 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="extract-content" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948648 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="extract-content" Dec 05 20:37:44 crc kubenswrapper[4885]: E1205 20:37:44.948669 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59678b29-6ffe-4d18-a8bb-8bf4717f9b10" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948679 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="59678b29-6ffe-4d18-a8bb-8bf4717f9b10" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:44 crc kubenswrapper[4885]: E1205 20:37:44.948704 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="registry-server" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948712 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="registry-server" Dec 05 20:37:44 crc kubenswrapper[4885]: E1205 20:37:44.948747 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="extract-utilities" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948756 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="extract-utilities" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.948964 4885 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="57a55ee5-f76d-421c-acb1-034c9dbf2d05" containerName="registry-server" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.949010 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="59678b29-6ffe-4d18-a8bb-8bf4717f9b10" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.949791 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.952489 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.952713 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.952745 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.953292 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:44 crc kubenswrapper[4885]: I1205 20:37:44.958649 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk"] Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.036026 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc279\" (UniqueName: \"kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.036145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.036171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.138809 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.138865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.139190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc279\" (UniqueName: \"kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.144036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.144215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.161745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc279\" (UniqueName: \"kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.334371 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.839225 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk"] Dec 05 20:37:45 crc kubenswrapper[4885]: I1205 20:37:45.864129 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" event={"ID":"b27a1f4c-ba65-4b22-885a-e642064f7c27","Type":"ContainerStarted","Data":"680be54fafcecea758d68284f99d476d3b00a6a00c464789e00f81bb2af3954c"} Dec 05 20:37:46 crc kubenswrapper[4885]: I1205 20:37:46.881911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" event={"ID":"b27a1f4c-ba65-4b22-885a-e642064f7c27","Type":"ContainerStarted","Data":"855b383dcdbf2908b4a29f57cfd9a1ba0fbe3eec0affda8812498c188d1876de"} Dec 05 20:37:46 crc kubenswrapper[4885]: I1205 20:37:46.908764 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" podStartSLOduration=2.505686055 podStartE2EDuration="2.908740962s" podCreationTimestamp="2025-12-05 20:37:44 +0000 UTC" firstStartedPulling="2025-12-05 20:37:45.847221161 +0000 UTC m=+1931.144036812" lastFinishedPulling="2025-12-05 20:37:46.250276058 +0000 UTC m=+1931.547091719" observedRunningTime="2025-12-05 20:37:46.897834091 +0000 UTC m=+1932.194649762" watchObservedRunningTime="2025-12-05 20:37:46.908740962 +0000 UTC m=+1932.205556613" Dec 05 20:37:55 crc kubenswrapper[4885]: I1205 20:37:55.317358 4885 scope.go:117] "RemoveContainer" containerID="7ffecd4c7a6cf04b6c732f34f34da32565af45ee5c40465741a1ce8297e28b53" Dec 05 20:37:56 crc kubenswrapper[4885]: I1205 20:37:56.974303 4885 generic.go:334] "Generic (PLEG): container finished" podID="b27a1f4c-ba65-4b22-885a-e642064f7c27" containerID="855b383dcdbf2908b4a29f57cfd9a1ba0fbe3eec0affda8812498c188d1876de" exitCode=0 Dec 05 20:37:56 crc kubenswrapper[4885]: I1205 20:37:56.974391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" event={"ID":"b27a1f4c-ba65-4b22-885a-e642064f7c27","Type":"ContainerDied","Data":"855b383dcdbf2908b4a29f57cfd9a1ba0fbe3eec0affda8812498c188d1876de"} Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.425101 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.551259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc279\" (UniqueName: \"kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279\") pod \"b27a1f4c-ba65-4b22-885a-e642064f7c27\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.551647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory\") pod \"b27a1f4c-ba65-4b22-885a-e642064f7c27\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.551744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key\") pod \"b27a1f4c-ba65-4b22-885a-e642064f7c27\" (UID: \"b27a1f4c-ba65-4b22-885a-e642064f7c27\") " Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.558376 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279" (OuterVolumeSpecName: "kube-api-access-wc279") pod "b27a1f4c-ba65-4b22-885a-e642064f7c27" (UID: "b27a1f4c-ba65-4b22-885a-e642064f7c27"). InnerVolumeSpecName "kube-api-access-wc279". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.577128 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory" (OuterVolumeSpecName: "inventory") pod "b27a1f4c-ba65-4b22-885a-e642064f7c27" (UID: "b27a1f4c-ba65-4b22-885a-e642064f7c27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.582799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b27a1f4c-ba65-4b22-885a-e642064f7c27" (UID: "b27a1f4c-ba65-4b22-885a-e642064f7c27"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.654776 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc279\" (UniqueName: \"kubernetes.io/projected/b27a1f4c-ba65-4b22-885a-e642064f7c27-kube-api-access-wc279\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.654838 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.654852 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27a1f4c-ba65-4b22-885a-e642064f7c27-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.995545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" event={"ID":"b27a1f4c-ba65-4b22-885a-e642064f7c27","Type":"ContainerDied","Data":"680be54fafcecea758d68284f99d476d3b00a6a00c464789e00f81bb2af3954c"} Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.995579 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680be54fafcecea758d68284f99d476d3b00a6a00c464789e00f81bb2af3954c" Dec 05 20:37:58 crc kubenswrapper[4885]: I1205 20:37:58.995614 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.099892 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh"] Dec 05 20:37:59 crc kubenswrapper[4885]: E1205 20:37:59.100357 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27a1f4c-ba65-4b22-885a-e642064f7c27" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.100384 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27a1f4c-ba65-4b22-885a-e642064f7c27" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.100632 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27a1f4c-ba65-4b22-885a-e642064f7c27" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.101320 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.104183 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.104674 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.105138 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.105168 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.105202 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.105347 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.105519 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.106567 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.118583 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh"] Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.265899 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.266637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.266697 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.266875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267121 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267154 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkfd5\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.267647 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369284 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369358 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369582 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkfd5\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369670 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.369770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.375852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.376078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.376404 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.376850 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.377352 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.377666 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.377869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.378185 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.378831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.389084 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.389096 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.389380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.389731 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.393832 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkfd5\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pqprh\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:37:59 crc kubenswrapper[4885]: I1205 20:37:59.422039 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:38:00 crc kubenswrapper[4885]: I1205 20:38:00.134123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh"] Dec 05 20:38:01 crc kubenswrapper[4885]: I1205 20:38:01.020301 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" event={"ID":"9c9ed39f-ee5e-4c66-8171-488ed01847db","Type":"ContainerStarted","Data":"f7ad9b46099d0eb7e5edb33730d5868f75016d68ff0563369e59c7dab30c70a8"} Dec 05 20:38:01 crc kubenswrapper[4885]: I1205 20:38:01.020680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" event={"ID":"9c9ed39f-ee5e-4c66-8171-488ed01847db","Type":"ContainerStarted","Data":"f3e158de8eddb7bab59b108f9fbb9b3adbc9620a0b7fed08622f21f91643e045"} Dec 05 20:38:01 crc kubenswrapper[4885]: I1205 20:38:01.044937 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" podStartSLOduration=1.629860342 podStartE2EDuration="2.044914114s" podCreationTimestamp="2025-12-05 20:37:59 +0000 UTC" firstStartedPulling="2025-12-05 20:38:00.137413753 +0000 UTC m=+1945.434229414" lastFinishedPulling="2025-12-05 20:38:00.552467525 +0000 UTC m=+1945.849283186" observedRunningTime="2025-12-05 20:38:01.04448731 +0000 UTC m=+1946.341303031" watchObservedRunningTime="2025-12-05 20:38:01.044914114 +0000 UTC m=+1946.341729805" Dec 05 20:38:44 crc kubenswrapper[4885]: I1205 20:38:44.461833 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c9ed39f-ee5e-4c66-8171-488ed01847db" containerID="f7ad9b46099d0eb7e5edb33730d5868f75016d68ff0563369e59c7dab30c70a8" exitCode=0 Dec 05 20:38:44 crc kubenswrapper[4885]: I1205 20:38:44.461884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" event={"ID":"9c9ed39f-ee5e-4c66-8171-488ed01847db","Type":"ContainerDied","Data":"f7ad9b46099d0eb7e5edb33730d5868f75016d68ff0563369e59c7dab30c70a8"} Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.869421 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.961895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.961988 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962199 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkfd5\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962243 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962324 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962367 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962482 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962534 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.962616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.963614 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.963672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.970086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.970121 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.970264 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.970877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.972683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.972896 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.974805 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.974860 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.975819 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.977876 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:45 crc kubenswrapper[4885]: I1205 20:38:45.980472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5" (OuterVolumeSpecName: "kube-api-access-kkfd5") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "kube-api-access-kkfd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.009178 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.015736 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory" (OuterVolumeSpecName: "inventory") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.067191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle\") pod \"9c9ed39f-ee5e-4c66-8171-488ed01847db\" (UID: \"9c9ed39f-ee5e-4c66-8171-488ed01847db\") " Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068097 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068133 4885 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068155 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068176 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkfd5\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-kube-api-access-kkfd5\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068196 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068214 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068232 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068249 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 
20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068267 4885 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068291 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068318 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068345 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.068372 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c9ed39f-ee5e-4c66-8171-488ed01847db-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.070841 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9c9ed39f-ee5e-4c66-8171-488ed01847db" (UID: "9c9ed39f-ee5e-4c66-8171-488ed01847db"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.170547 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9ed39f-ee5e-4c66-8171-488ed01847db-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.485103 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" event={"ID":"9c9ed39f-ee5e-4c66-8171-488ed01847db","Type":"ContainerDied","Data":"f3e158de8eddb7bab59b108f9fbb9b3adbc9620a0b7fed08622f21f91643e045"} Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.485173 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e158de8eddb7bab59b108f9fbb9b3adbc9620a0b7fed08622f21f91643e045" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.485171 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pqprh" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.631626 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.631921 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.634423 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j"] Dec 05 20:38:46 crc kubenswrapper[4885]: E1205 20:38:46.634908 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9ed39f-ee5e-4c66-8171-488ed01847db" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.634935 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9ed39f-ee5e-4c66-8171-488ed01847db" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.635344 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9ed39f-ee5e-4c66-8171-488ed01847db" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.636174 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.639150 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.639229 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.642415 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.642520 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.642604 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.645985 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j"] Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.693812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.693913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.693947 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.694003 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ktx\" (UniqueName: \"kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.694038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.795438 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ktx\" 
(UniqueName: \"kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.795697 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.795833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.795957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.796103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.797274 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.802229 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.803821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.804235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.815590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ktx\" (UniqueName: \"kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-m548j\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:46 crc kubenswrapper[4885]: I1205 20:38:46.964691 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:38:47 crc kubenswrapper[4885]: I1205 20:38:47.551329 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j"] Dec 05 20:38:48 crc kubenswrapper[4885]: I1205 20:38:48.512621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" event={"ID":"de5ebae2-9fe8-4b8a-ab85-60226fa56525","Type":"ContainerStarted","Data":"b72f4638eed70460f14894a99025a4e2c275777d44f74dac335522a15f4f0094"} Dec 05 20:38:48 crc kubenswrapper[4885]: I1205 20:38:48.513333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" event={"ID":"de5ebae2-9fe8-4b8a-ab85-60226fa56525","Type":"ContainerStarted","Data":"2019c527a9d473649fa7b46ef60b55f8a2158a8c1d7cc64a16e25dbb8af82e4e"} Dec 05 20:38:48 crc kubenswrapper[4885]: I1205 20:38:48.540946 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" podStartSLOduration=2.032044846 podStartE2EDuration="2.540919419s" podCreationTimestamp="2025-12-05 20:38:46 +0000 UTC" firstStartedPulling="2025-12-05 20:38:47.548399303 +0000 UTC m=+1992.845214984" lastFinishedPulling="2025-12-05 20:38:48.057273896 +0000 UTC m=+1993.354089557" observedRunningTime="2025-12-05 20:38:48.532872938 +0000 UTC m=+1993.829688609" watchObservedRunningTime="2025-12-05 20:38:48.540919419 +0000 UTC m=+1993.837735100" Dec 05 20:39:16 crc kubenswrapper[4885]: I1205 20:39:16.631550 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:39:16 crc kubenswrapper[4885]: I1205 20:39:16.632107 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:39:46 crc kubenswrapper[4885]: I1205 20:39:46.630616 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:39:46 crc kubenswrapper[4885]: I1205 20:39:46.631430 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:39:46 crc kubenswrapper[4885]: I1205 20:39:46.631495 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 20:39:46 crc kubenswrapper[4885]: I1205 20:39:46.632384 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:39:46 crc kubenswrapper[4885]: I1205 20:39:46.632459 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a" gracePeriod=600 Dec 05 20:39:47 crc kubenswrapper[4885]: I1205 20:39:47.189383 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a" exitCode=0 Dec 05 20:39:47 crc kubenswrapper[4885]: I1205 20:39:47.193399 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a"} Dec 05 20:39:47 crc kubenswrapper[4885]: I1205 20:39:47.193450 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"} Dec 05 20:39:47 crc kubenswrapper[4885]: I1205 20:39:47.193470 4885 scope.go:117] "RemoveContainer" containerID="00e8fbd8f103b858dc77f8ff79a79794d59cd98642165400e751edb85deac4ba" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.161869 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"] Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.165003 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.171898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"] Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.259311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.259692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57d8k\" (UniqueName: \"kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.259755 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.360936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.361055 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57d8k\" (UniqueName: \"kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.361115 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.361603 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.361653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.383170 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57d8k\" (UniqueName: \"kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k\") pod \"redhat-operators-xhpvd\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") " pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.496721 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:39:56 crc kubenswrapper[4885]: I1205 20:39:56.953922 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"] Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.303419 4885 generic.go:334] "Generic (PLEG): container finished" podID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerID="413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830" exitCode=0 Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.303484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerDied","Data":"413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830"} Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.303509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerStarted","Data":"7bb1f45795f509122a33b572ceb7ec6289b210df0ea230b260a68e69b2d4e89a"} Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.305392 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.306070 4885 generic.go:334] "Generic (PLEG): container finished" podID="de5ebae2-9fe8-4b8a-ab85-60226fa56525" containerID="b72f4638eed70460f14894a99025a4e2c275777d44f74dac335522a15f4f0094" exitCode=0 Dec 05 20:39:57 crc kubenswrapper[4885]: I1205 20:39:57.306094 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" event={"ID":"de5ebae2-9fe8-4b8a-ab85-60226fa56525","Type":"ContainerDied","Data":"b72f4638eed70460f14894a99025a4e2c275777d44f74dac335522a15f4f0094"} Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.315831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerStarted","Data":"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"} Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.767352 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.913490 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory\") pod \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.913647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8ktx\" (UniqueName: \"kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx\") pod \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.913728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle\") pod \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.913779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key\") pod \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.913919 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0\") pod \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\" (UID: \"de5ebae2-9fe8-4b8a-ab85-60226fa56525\") " Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.919450 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx" (OuterVolumeSpecName: "kube-api-access-s8ktx") pod "de5ebae2-9fe8-4b8a-ab85-60226fa56525" (UID: "de5ebae2-9fe8-4b8a-ab85-60226fa56525"). InnerVolumeSpecName "kube-api-access-s8ktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.920316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de5ebae2-9fe8-4b8a-ab85-60226fa56525" (UID: "de5ebae2-9fe8-4b8a-ab85-60226fa56525"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.939266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "de5ebae2-9fe8-4b8a-ab85-60226fa56525" (UID: "de5ebae2-9fe8-4b8a-ab85-60226fa56525"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.940417 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de5ebae2-9fe8-4b8a-ab85-60226fa56525" (UID: "de5ebae2-9fe8-4b8a-ab85-60226fa56525"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:58 crc kubenswrapper[4885]: I1205 20:39:58.952141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory" (OuterVolumeSpecName: "inventory") pod "de5ebae2-9fe8-4b8a-ab85-60226fa56525" (UID: "de5ebae2-9fe8-4b8a-ab85-60226fa56525"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.016097 4885 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.016140 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.016154 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8ktx\" (UniqueName: \"kubernetes.io/projected/de5ebae2-9fe8-4b8a-ab85-60226fa56525-kube-api-access-s8ktx\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.016172 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.016183 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de5ebae2-9fe8-4b8a-ab85-60226fa56525-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.324259 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.325096 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-m548j" event={"ID":"de5ebae2-9fe8-4b8a-ab85-60226fa56525","Type":"ContainerDied","Data":"2019c527a9d473649fa7b46ef60b55f8a2158a8c1d7cc64a16e25dbb8af82e4e"} Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.325128 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2019c527a9d473649fa7b46ef60b55f8a2158a8c1d7cc64a16e25dbb8af82e4e" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.327954 4885 generic.go:334] "Generic (PLEG): container finished" podID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerID="9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b" exitCode=0 Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.327985 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerDied","Data":"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"} Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.421938 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d"] Dec 05 20:39:59 crc kubenswrapper[4885]: E1205 20:39:59.422771 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5ebae2-9fe8-4b8a-ab85-60226fa56525" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.422903 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5ebae2-9fe8-4b8a-ab85-60226fa56525" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.423339 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5ebae2-9fe8-4b8a-ab85-60226fa56525" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.424526 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.427675 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.427932 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.428067 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.428188 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.428284 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.428285 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.432279 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d"] Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.526215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.526619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.526785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.526916 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.527108 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pq28\" (UniqueName: 
\"kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.527292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.630741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pq28\" (UniqueName: \"kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.637250 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.637333 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.641171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.649610 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.649937 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.651852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pq28\" (UniqueName: \"kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:39:59 crc kubenswrapper[4885]: I1205 20:39:59.746947 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:40:00 crc kubenswrapper[4885]: I1205 20:40:00.266261 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d"] Dec 05 20:40:00 crc kubenswrapper[4885]: W1205 20:40:00.280012 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod525d9ebb_07fb_41b7_9059_d609ed9cac0e.slice/crio-ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687 WatchSource:0}: Error finding container ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687: Status 404 returned error can't find the container with id ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687 Dec 05 20:40:00 crc kubenswrapper[4885]: I1205 20:40:00.337506 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" event={"ID":"525d9ebb-07fb-41b7-9059-d609ed9cac0e","Type":"ContainerStarted","Data":"ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687"} Dec 05 20:40:00 crc kubenswrapper[4885]: I1205 20:40:00.339926 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerStarted","Data":"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"} Dec 05 20:40:01 crc kubenswrapper[4885]: I1205 20:40:01.353193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" event={"ID":"525d9ebb-07fb-41b7-9059-d609ed9cac0e","Type":"ContainerStarted","Data":"1bb09b68472b5ee2c25943db54cd2032668ce54aeb91172179f8629b96efe8ef"} Dec 05 20:40:01 crc kubenswrapper[4885]: I1205 20:40:01.378163 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xhpvd" podStartSLOduration=2.8573603690000002 podStartE2EDuration="5.378143472s" podCreationTimestamp="2025-12-05 20:39:56 +0000 UTC" firstStartedPulling="2025-12-05 20:39:57.305190125 +0000 UTC m=+2062.602005786" lastFinishedPulling="2025-12-05 20:39:59.825973228 +0000 UTC m=+2065.122788889" observedRunningTime="2025-12-05 20:40:00.356231798 +0000 UTC m=+2065.653047459" watchObservedRunningTime="2025-12-05 20:40:01.378143472 +0000 UTC m=+2066.674959133" Dec 05 20:40:01 crc kubenswrapper[4885]: I1205 20:40:01.381156 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" podStartSLOduration=1.671138424 podStartE2EDuration="2.381146736s" podCreationTimestamp="2025-12-05 20:39:59 +0000 UTC" firstStartedPulling="2025-12-05 20:40:00.28358821 +0000 UTC m=+2065.580403871" lastFinishedPulling="2025-12-05 20:40:00.993596522 +0000 UTC m=+2066.290412183" observedRunningTime="2025-12-05 20:40:01.376717977 +0000 UTC m=+2066.673533678" watchObservedRunningTime="2025-12-05 20:40:01.381146736 +0000 UTC m=+2066.677962397" Dec 05 20:40:06 crc kubenswrapper[4885]: I1205 20:40:06.497839 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:40:06 crc kubenswrapper[4885]: I1205 20:40:06.498425 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhpvd" Dec 05 20:40:06 crc kubenswrapper[4885]: 
I1205 20:40:06.541362 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhpvd"
Dec 05 20:40:07 crc kubenswrapper[4885]: I1205 20:40:07.461477 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhpvd"
Dec 05 20:40:07 crc kubenswrapper[4885]: I1205 20:40:07.511430 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"]
Dec 05 20:40:09 crc kubenswrapper[4885]: I1205 20:40:09.419184 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xhpvd" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server" containerID="cri-o://831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f" gracePeriod=2
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.339351 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhpvd"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.430457 4885 generic.go:334] "Generic (PLEG): container finished" podID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerID="831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f" exitCode=0
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.430502 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerDied","Data":"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"}
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.430550 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhpvd"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.430581 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhpvd" event={"ID":"e70e6729-f4f9-49b0-9b10-113689e2b2cb","Type":"ContainerDied","Data":"7bb1f45795f509122a33b572ceb7ec6289b210df0ea230b260a68e69b2d4e89a"}
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.430605 4885 scope.go:117] "RemoveContainer" containerID="831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.452064 4885 scope.go:117] "RemoveContainer" containerID="9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.454095 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content\") pod \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") "
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.454259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57d8k\" (UniqueName: \"kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k\") pod \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") "
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.454415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities\") pod \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\" (UID: \"e70e6729-f4f9-49b0-9b10-113689e2b2cb\") "
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.455547 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities" (OuterVolumeSpecName: "utilities") pod "e70e6729-f4f9-49b0-9b10-113689e2b2cb" (UID: "e70e6729-f4f9-49b0-9b10-113689e2b2cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.476555 4885 scope.go:117] "RemoveContainer" containerID="413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.478378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k" (OuterVolumeSpecName: "kube-api-access-57d8k") pod "e70e6729-f4f9-49b0-9b10-113689e2b2cb" (UID: "e70e6729-f4f9-49b0-9b10-113689e2b2cb"). InnerVolumeSpecName "kube-api-access-57d8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.556322 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57d8k\" (UniqueName: \"kubernetes.io/projected/e70e6729-f4f9-49b0-9b10-113689e2b2cb-kube-api-access-57d8k\") on node \"crc\" DevicePath \"\""
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.556359 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.568638 4885 scope.go:117] "RemoveContainer" containerID="831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"
Dec 05 20:40:10 crc kubenswrapper[4885]: E1205 20:40:10.569936 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f\": container with ID starting with 831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f not found: ID does not exist" containerID="831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.569982 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f"} err="failed to get container status \"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f\": rpc error: code = NotFound desc = could not find container \"831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f\": container with ID starting with 831b423e397b63d276e6df6de9dd0e49494a6dca198b4188c555d37d482cc69f not found: ID does not exist"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.570007 4885 scope.go:117] "RemoveContainer" containerID="9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.570873 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e70e6729-f4f9-49b0-9b10-113689e2b2cb" (UID: "e70e6729-f4f9-49b0-9b10-113689e2b2cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:40:10 crc kubenswrapper[4885]: E1205 20:40:10.571347 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b\": container with ID starting with 9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b not found: ID does not exist" containerID="9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.571377 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b"} err="failed to get container status \"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b\": rpc error: code = NotFound desc = could not find container \"9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b\": container with ID starting with 9f955bb24dbb1ddb05c4b87cb249c3b906742dbb4c57a377e60c4bd52afc855b not found: ID does not exist"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.571396 4885 scope.go:117] "RemoveContainer" containerID="413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830"
Dec 05 20:40:10 crc kubenswrapper[4885]: E1205 20:40:10.571656 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830\": container with ID starting with 413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830 not found: ID does not exist" containerID="413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.571677 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830"} err="failed to get container status \"413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830\": rpc error: code = NotFound desc = could not find container \"413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830\": container with ID starting with 413740a8bf321f3190702122825ce9032247eb488e72df9565bbe234f8dff830 not found: ID does not exist"
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.658695 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70e6729-f4f9-49b0-9b10-113689e2b2cb-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.762902 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"]
Dec 05 20:40:10 crc kubenswrapper[4885]: I1205 20:40:10.769970 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xhpvd"]
Dec 05 20:40:11 crc kubenswrapper[4885]: I1205 20:40:11.198595 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" path="/var/lib/kubelet/pods/e70e6729-f4f9-49b0-9b10-113689e2b2cb/volumes"
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" event={"ID":"525d9ebb-07fb-41b7-9059-d609ed9cac0e","Type":"ContainerDied","Data":"1bb09b68472b5ee2c25943db54cd2032668ce54aeb91172179f8629b96efe8ef"} Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.307714 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.350472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.352071 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pq28\" (UniqueName: \"kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.352133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.357790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28" (OuterVolumeSpecName: "kube-api-access-4pq28") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "kube-api-access-4pq28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.385852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.387274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454028 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454091 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory\") pod \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\" (UID: \"525d9ebb-07fb-41b7-9059-d609ed9cac0e\") " Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454410 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454429 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.454439 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pq28\" (UniqueName: \"kubernetes.io/projected/525d9ebb-07fb-41b7-9059-d609ed9cac0e-kube-api-access-4pq28\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.458246 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.484792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.500393 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory" (OuterVolumeSpecName: "inventory") pod "525d9ebb-07fb-41b7-9059-d609ed9cac0e" (UID: "525d9ebb-07fb-41b7-9059-d609ed9cac0e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.557299 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.557353 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.557367 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525d9ebb-07fb-41b7-9059-d609ed9cac0e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.880038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" event={"ID":"525d9ebb-07fb-41b7-9059-d609ed9cac0e","Type":"ContainerDied","Data":"ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687"} Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.880079 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1f8e63495ece8575cb3bcdbc5468f36237c1e4e81089011d9e10dfa1621687" Dec 05 20:40:55 crc kubenswrapper[4885]: I1205 20:40:55.880113 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050226 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"] Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050581 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525d9ebb-07fb-41b7-9059-d609ed9cac0e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050599 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="525d9ebb-07fb-41b7-9059-d609ed9cac0e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050621 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-content" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050628 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-content" Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050658 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-utilities" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-utilities" Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050680 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050685 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server" Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050226 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"]
Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050581 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525d9ebb-07fb-41b7-9059-d609ed9cac0e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050599 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="525d9ebb-07fb-41b7-9059-d609ed9cac0e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050621 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-content"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050628 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-content"
Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050658 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-utilities"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="extract-utilities"
Dec 05 20:40:56 crc kubenswrapper[4885]: E1205 20:40:56.050680 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050685 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050861 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="525d9ebb-07fb-41b7-9059-d609ed9cac0e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.050878 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70e6729-f4f9-49b0-9b10-113689e2b2cb" containerName="registry-server"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.051531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.053553 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.053707 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.054261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.054510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.056421 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.068874 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"]
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.166388 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.166503 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.166543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.166599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.166629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftb25\" (UniqueName: \"kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.268502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.268615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftb25\" (UniqueName: \"kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.268743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.268970 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.269119 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.273770 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.273904 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.277180 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.278789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.289196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftb25\" (UniqueName: \"kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.369299 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:40:56 crc kubenswrapper[4885]: I1205 20:40:56.915528 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"]
Dec 05 20:40:57 crc kubenswrapper[4885]: I1205 20:40:57.900632 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt" event={"ID":"7b51c87e-b603-43e2-bb06-a8e9a0416a59","Type":"ContainerStarted","Data":"409d7cd28f7e50531da0d00718de82e8a6e3c396c3b8d420b2243af08dbaccec"}
Dec 05 20:40:57 crc kubenswrapper[4885]: I1205 20:40:57.901041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt" event={"ID":"7b51c87e-b603-43e2-bb06-a8e9a0416a59","Type":"ContainerStarted","Data":"77b7631a1abece3f76dd750ea705de00ea6713af7798024593cd9513e0097448"}
Dec 05 20:40:57 crc kubenswrapper[4885]: I1205 20:40:57.925656 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt" podStartSLOduration=1.417859653 podStartE2EDuration="1.925632751s" podCreationTimestamp="2025-12-05 20:40:56 +0000 UTC" firstStartedPulling="2025-12-05 20:40:56.916555258 +0000 UTC m=+2122.213370929" lastFinishedPulling="2025-12-05 20:40:57.424328326 +0000 UTC m=+2122.721144027" observedRunningTime="2025-12-05 20:40:57.920562873 +0000 UTC m=+2123.217378564" watchObservedRunningTime="2025-12-05 20:40:57.925632751 +0000 UTC m=+2123.222448422"
Dec 05 20:41:46 crc kubenswrapper[4885]: I1205 20:41:46.630795 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:41:46 crc kubenswrapper[4885]: I1205 20:41:46.631190 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:42:16 crc kubenswrapper[4885]: I1205 20:42:16.631453 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:42:16 crc kubenswrapper[4885]: I1205 20:42:16.631982 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:42:46 crc kubenswrapper[4885]: I1205 20:42:46.631194 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:42:46 crc kubenswrapper[4885]: I1205 20:42:46.631855 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:42:46 crc kubenswrapper[4885]: I1205 20:42:46.631905 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc"
Dec 05 20:42:46 crc kubenswrapper[4885]: I1205 20:42:46.632851 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:42:46 crc kubenswrapper[4885]: I1205 20:42:46.632922 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" gracePeriod=600
Dec 05 20:42:46 crc kubenswrapper[4885]: E1205 20:42:46.767382 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:42:47 crc kubenswrapper[4885]: I1205 20:42:47.013942 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" exitCode=0
Dec 05 20:42:47 crc kubenswrapper[4885]: I1205 20:42:47.013990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"}
Dec 05 20:42:47 crc kubenswrapper[4885]: I1205 20:42:47.014048 4885 scope.go:117] "RemoveContainer" containerID="98812e64a2f367ecc2033031c0d3a29d3f95a9bab3a69de50ab7fd7e937cb70a"
Dec 05 20:42:47 crc kubenswrapper[4885]: I1205 20:42:47.014731 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:42:47 crc kubenswrapper[4885]: E1205 20:42:47.015003 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:43:02 crc kubenswrapper[4885]: I1205 20:43:02.172628 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:43:02 crc kubenswrapper[4885]: E1205 20:43:02.173372 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:43:15 crc kubenswrapper[4885]: I1205 20:43:15.182750 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:43:15 crc kubenswrapper[4885]: E1205 20:43:15.183531 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:43:30 crc kubenswrapper[4885]: I1205 20:43:30.172474 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:43:30 crc kubenswrapper[4885]: E1205 20:43:30.174199 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:43:41 crc kubenswrapper[4885]: I1205 20:43:41.173716 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:43:41 crc kubenswrapper[4885]: E1205 20:43:41.174623 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:43:53 crc kubenswrapper[4885]: I1205 20:43:53.173238 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:43:53 crc kubenswrapper[4885]: E1205 20:43:53.173999 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:44:05 crc kubenswrapper[4885]: I1205 20:44:05.180152 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:44:05 crc kubenswrapper[4885]: E1205 20:44:05.181695 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:44:17 crc kubenswrapper[4885]: I1205 20:44:17.173392 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:44:17 crc kubenswrapper[4885]: E1205 20:44:17.174309 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:44:29 crc kubenswrapper[4885]: I1205 20:44:29.173907 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:44:29 crc kubenswrapper[4885]: E1205 20:44:29.174990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:44:41 crc kubenswrapper[4885]: I1205 20:44:41.173381 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:44:41 crc kubenswrapper[4885]: E1205 20:44:41.174548 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:44:56 crc kubenswrapper[4885]: I1205 20:44:56.172768 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:44:56 crc kubenswrapper[4885]: E1205 20:44:56.173646 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.156771 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"]
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.159411 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.164659 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.167713 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dstx9\" (UniqueName: \"kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.167890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.168148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.173453 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"]
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.173626 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.270344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dstx9\" (UniqueName: \"kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.270727 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.271557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.272627 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.281541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.285282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dstx9\" (UniqueName: \"kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9\") pod \"collect-profiles-29416125-tsn2n\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.490239 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:00 crc kubenswrapper[4885]: I1205 20:45:00.924843 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"]
Dec 05 20:45:01 crc kubenswrapper[4885]: I1205 20:45:01.396913 4885 generic.go:334] "Generic (PLEG): container finished" podID="799f2d01-917a-4b4d-a870-c364418a5802" containerID="a6de4e74e7933df814dce48dc583c3474177dff3bd49a4d01dc91e9db5932ccf" exitCode=0
Dec 05 20:45:01 crc kubenswrapper[4885]: I1205 20:45:01.397061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n" event={"ID":"799f2d01-917a-4b4d-a870-c364418a5802","Type":"ContainerDied","Data":"a6de4e74e7933df814dce48dc583c3474177dff3bd49a4d01dc91e9db5932ccf"}
Dec 05 20:45:01 crc kubenswrapper[4885]: I1205 20:45:01.397189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n" event={"ID":"799f2d01-917a-4b4d-a870-c364418a5802","Type":"ContainerStarted","Data":"078cb720d529a10fb647324e87c6684e7e3118c37d78ec06f01a3f5640619991"}
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.799003 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.922528 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dstx9\" (UniqueName: \"kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9\") pod \"799f2d01-917a-4b4d-a870-c364418a5802\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") "
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.922696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume\") pod \"799f2d01-917a-4b4d-a870-c364418a5802\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") "
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.922763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume\") pod \"799f2d01-917a-4b4d-a870-c364418a5802\" (UID: \"799f2d01-917a-4b4d-a870-c364418a5802\") "
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.924001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume" (OuterVolumeSpecName: "config-volume") pod "799f2d01-917a-4b4d-a870-c364418a5802" (UID: "799f2d01-917a-4b4d-a870-c364418a5802"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.924589 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/799f2d01-917a-4b4d-a870-c364418a5802-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.930056 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "799f2d01-917a-4b4d-a870-c364418a5802" (UID: "799f2d01-917a-4b4d-a870-c364418a5802"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:45:02 crc kubenswrapper[4885]: I1205 20:45:02.930597 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9" (OuterVolumeSpecName: "kube-api-access-dstx9") pod "799f2d01-917a-4b4d-a870-c364418a5802" (UID: "799f2d01-917a-4b4d-a870-c364418a5802"). InnerVolumeSpecName "kube-api-access-dstx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.026624 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dstx9\" (UniqueName: \"kubernetes.io/projected/799f2d01-917a-4b4d-a870-c364418a5802-kube-api-access-dstx9\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.026990 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/799f2d01-917a-4b4d-a870-c364418a5802-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.417440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n" event={"ID":"799f2d01-917a-4b4d-a870-c364418a5802","Type":"ContainerDied","Data":"078cb720d529a10fb647324e87c6684e7e3118c37d78ec06f01a3f5640619991"}
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.417496 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="078cb720d529a10fb647324e87c6684e7e3118c37d78ec06f01a3f5640619991"
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.417535 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-tsn2n"
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.883834 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p"]
Dec 05 20:45:03 crc kubenswrapper[4885]: I1205 20:45:03.894777 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-95j7p"]
Dec 05 20:45:05 crc kubenswrapper[4885]: I1205 20:45:05.183750 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2c6d12-1e18-498c-82e4-9c778e7c4aea" path="/var/lib/kubelet/pods/3e2c6d12-1e18-498c-82e4-9c778e7c4aea/volumes"
Dec 05 20:45:09 crc kubenswrapper[4885]: I1205 20:45:09.172593 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:45:09 crc kubenswrapper[4885]: E1205 20:45:09.173336 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:45:18 crc kubenswrapper[4885]: I1205 20:45:18.578842 4885 generic.go:334] "Generic (PLEG): container finished" podID="7b51c87e-b603-43e2-bb06-a8e9a0416a59" containerID="409d7cd28f7e50531da0d00718de82e8a6e3c396c3b8d420b2243af08dbaccec" exitCode=0
Dec 05 20:45:18 crc kubenswrapper[4885]: I1205 20:45:18.578916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt" event={"ID":"7b51c87e-b603-43e2-bb06-a8e9a0416a59","Type":"ContainerDied","Data":"409d7cd28f7e50531da0d00718de82e8a6e3c396c3b8d420b2243af08dbaccec"}
Dec 05 20:45:19 crc kubenswrapper[4885]: I1205 20:45:19.992515 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.119736 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0\") pod \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") "
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.119887 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory\") pod \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") "
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.119979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftb25\" (UniqueName: \"kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25\") pod \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") "
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.120060 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle\") pod \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") "
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.120118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key\") pod \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\" (UID: \"7b51c87e-b603-43e2-bb06-a8e9a0416a59\") "
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.126198 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7b51c87e-b603-43e2-bb06-a8e9a0416a59" (UID: "7b51c87e-b603-43e2-bb06-a8e9a0416a59"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.126240 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25" (OuterVolumeSpecName: "kube-api-access-ftb25") pod "7b51c87e-b603-43e2-bb06-a8e9a0416a59" (UID: "7b51c87e-b603-43e2-bb06-a8e9a0416a59"). InnerVolumeSpecName "kube-api-access-ftb25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.145908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7b51c87e-b603-43e2-bb06-a8e9a0416a59" (UID: "7b51c87e-b603-43e2-bb06-a8e9a0416a59"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.146319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory" (OuterVolumeSpecName: "inventory") pod "7b51c87e-b603-43e2-bb06-a8e9a0416a59" (UID: "7b51c87e-b603-43e2-bb06-a8e9a0416a59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.148186 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b51c87e-b603-43e2-bb06-a8e9a0416a59" (UID: "7b51c87e-b603-43e2-bb06-a8e9a0416a59"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.222668 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.222707 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.222722 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftb25\" (UniqueName: \"kubernetes.io/projected/7b51c87e-b603-43e2-bb06-a8e9a0416a59-kube-api-access-ftb25\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.222736 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.222747 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b51c87e-b603-43e2-bb06-a8e9a0416a59-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.598526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt" event={"ID":"7b51c87e-b603-43e2-bb06-a8e9a0416a59","Type":"ContainerDied","Data":"77b7631a1abece3f76dd750ea705de00ea6713af7798024593cd9513e0097448"}
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.598887 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b7631a1abece3f76dd750ea705de00ea6713af7798024593cd9513e0097448"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.598605 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.728555 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"]
Dec 05 20:45:20 crc kubenswrapper[4885]: E1205 20:45:20.729016 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799f2d01-917a-4b4d-a870-c364418a5802" containerName="collect-profiles"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.730379 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="799f2d01-917a-4b4d-a870-c364418a5802" containerName="collect-profiles"
Dec 05 20:45:20 crc kubenswrapper[4885]: E1205 20:45:20.730423 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b51c87e-b603-43e2-bb06-a8e9a0416a59" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.730433 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b51c87e-b603-43e2-bb06-a8e9a0416a59" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.730691 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="799f2d01-917a-4b4d-a870-c364418a5802" containerName="collect-profiles"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.730727 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b51c87e-b603-43e2-bb06-a8e9a0416a59" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.731603 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735166 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735434 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735443 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735569 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735166 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.735442 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.751717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"]
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.836767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.836829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.836972 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837094 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837162 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837229 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkq2\" (UniqueName: \"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837403 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.837456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.939317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkq2\" (UniqueName: \"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.939382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.939409 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.939954 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.940052 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.940080 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.940134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"
Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.940163 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID:
\"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.940191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.941276 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.943532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.943541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.943771 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.944396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.944532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.949481 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc 
kubenswrapper[4885]: I1205 20:45:20.951644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:20 crc kubenswrapper[4885]: I1205 20:45:20.961796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkq2\" (UniqueName: \"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9j89h\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:21 crc kubenswrapper[4885]: I1205 20:45:21.051628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:45:21 crc kubenswrapper[4885]: I1205 20:45:21.173580 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:45:21 crc kubenswrapper[4885]: E1205 20:45:21.174178 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:45:21 crc kubenswrapper[4885]: I1205 20:45:21.570177 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h"] Dec 05 20:45:21 crc kubenswrapper[4885]: W1205 20:45:21.576878 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453597ee_fc9f_4fc6_beb2_e4c75e1236db.slice/crio-8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd WatchSource:0}: Error finding container 8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd: Status 404 returned error can't find the container with id 8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd Dec 05 20:45:21 crc kubenswrapper[4885]: I1205 20:45:21.579230 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:45:21 crc kubenswrapper[4885]: I1205 20:45:21.607069 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" event={"ID":"453597ee-fc9f-4fc6-beb2-e4c75e1236db","Type":"ContainerStarted","Data":"8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd"} Dec 05 20:45:22 crc kubenswrapper[4885]: I1205 20:45:22.618454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" event={"ID":"453597ee-fc9f-4fc6-beb2-e4c75e1236db","Type":"ContainerStarted","Data":"3f82718d43b639145aa2718f9457e8e490b74ec75c2de6d655d9d206e9fc792e"} Dec 05 20:45:22 crc kubenswrapper[4885]: I1205 20:45:22.650182 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" podStartSLOduration=2.16148475 podStartE2EDuration="2.650160772s" 
podCreationTimestamp="2025-12-05 20:45:20 +0000 UTC" firstStartedPulling="2025-12-05 20:45:21.578861667 +0000 UTC m=+2386.875677338" lastFinishedPulling="2025-12-05 20:45:22.067537699 +0000 UTC m=+2387.364353360" observedRunningTime="2025-12-05 20:45:22.639521477 +0000 UTC m=+2387.936337128" watchObservedRunningTime="2025-12-05 20:45:22.650160772 +0000 UTC m=+2387.946976433" Dec 05 20:45:32 crc kubenswrapper[4885]: I1205 20:45:32.173463 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:45:32 crc kubenswrapper[4885]: E1205 20:45:32.174350 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:45:46 crc kubenswrapper[4885]: I1205 20:45:46.173066 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:45:46 crc kubenswrapper[4885]: E1205 20:45:46.174208 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:45:55 crc kubenswrapper[4885]: I1205 20:45:55.582449 4885 scope.go:117] "RemoveContainer" containerID="235e6c66258d2b260840a0b140f97d224d738da39eaf02e97a84ddab1029330f" Dec 05 20:46:01 crc kubenswrapper[4885]: I1205 20:46:01.172421 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:46:01 crc kubenswrapper[4885]: E1205 20:46:01.173215 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:46:15 crc kubenswrapper[4885]: I1205 20:46:15.181181 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:46:15 crc kubenswrapper[4885]: E1205 20:46:15.182159 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:46:26 crc kubenswrapper[4885]: I1205 20:46:26.172286 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:46:26 crc kubenswrapper[4885]: E1205 20:46:26.173198 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:46:38 crc kubenswrapper[4885]: I1205 20:46:38.173637 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:46:38 crc kubenswrapper[4885]: E1205 20:46:38.174945 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.769751 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.772728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.791315 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.869895 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.869971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4cq\" (UniqueName: \"kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.870397 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.969247 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.971101 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.971695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.971806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.971851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4cq\" (UniqueName: \"kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.972230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.972410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:42 crc kubenswrapper[4885]: I1205 20:46:42.986938 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.006903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4cq\" (UniqueName: \"kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq\") pod \"certified-operators-75hsh\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.073839 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.073893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gbvc\" (UniqueName: \"kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.073960 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.101478 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.175347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.175822 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.175858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gbvc\" (UniqueName: \"kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.176283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.176541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.207120 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gbvc\" (UniqueName: \"kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc\") pod \"community-operators-hjrjf\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.296495 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.711840 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:43 crc kubenswrapper[4885]: I1205 20:46:43.970104 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:44 crc kubenswrapper[4885]: W1205 20:46:44.002503 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2e7665_d55c_4cd5_8cfc_7768a953c353.slice/crio-72e492a6fd398f12f2db93f6403a5af590c9a8c7ad0631928c5249de18ee61d4 WatchSource:0}: Error finding container 72e492a6fd398f12f2db93f6403a5af590c9a8c7ad0631928c5249de18ee61d4: Status 404 returned error can't find the container with id 72e492a6fd398f12f2db93f6403a5af590c9a8c7ad0631928c5249de18ee61d4 Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.407890 4885 generic.go:334] "Generic (PLEG): container finished" podID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerID="566f739824957d6271cc54fbca7a99fa8890e35f3616002494a54a91353d2dd6" exitCode=0 Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.407959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerDied","Data":"566f739824957d6271cc54fbca7a99fa8890e35f3616002494a54a91353d2dd6"} Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.408239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerStarted","Data":"72e492a6fd398f12f2db93f6403a5af590c9a8c7ad0631928c5249de18ee61d4"} Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.410494 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerID="377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30" exitCode=0 Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.410527 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerDied","Data":"377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30"} Dec 05 20:46:44 crc kubenswrapper[4885]: I1205 20:46:44.410547 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerStarted","Data":"fad3d7a5ee29e82092b0b69b0027bdcf951c9710cff294bded18f2204dc309a9"} Dec 05 20:46:45 crc kubenswrapper[4885]: I1205 20:46:45.420735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerStarted","Data":"89487ae9d61b276830d13cfa0d4760146e5cd27d9b38690099e75ebcddce2854"} Dec 05 20:46:46 crc kubenswrapper[4885]: I1205 20:46:46.441180 4885 generic.go:334] "Generic (PLEG): container finished" podID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerID="89487ae9d61b276830d13cfa0d4760146e5cd27d9b38690099e75ebcddce2854" exitCode=0 Dec 05 20:46:46 crc kubenswrapper[4885]: I1205 20:46:46.441236 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" 
event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerDied","Data":"89487ae9d61b276830d13cfa0d4760146e5cd27d9b38690099e75ebcddce2854"} Dec 05 20:46:46 crc kubenswrapper[4885]: I1205 20:46:46.451771 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerID="51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba" exitCode=0 Dec 05 20:46:46 crc kubenswrapper[4885]: I1205 20:46:46.451820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerDied","Data":"51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba"} Dec 05 20:46:47 crc kubenswrapper[4885]: I1205 20:46:47.467345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerStarted","Data":"a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e"} Dec 05 20:46:47 crc kubenswrapper[4885]: I1205 20:46:47.470643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerStarted","Data":"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f"} Dec 05 20:46:47 crc kubenswrapper[4885]: I1205 20:46:47.492884 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hjrjf" podStartSLOduration=3.02340952 podStartE2EDuration="5.492864448s" podCreationTimestamp="2025-12-05 20:46:42 +0000 UTC" firstStartedPulling="2025-12-05 20:46:44.410093913 +0000 UTC m=+2469.706909564" lastFinishedPulling="2025-12-05 20:46:46.879548831 +0000 UTC m=+2472.176364492" observedRunningTime="2025-12-05 20:46:47.48531681 +0000 UTC m=+2472.782132471" watchObservedRunningTime="2025-12-05 20:46:47.492864448 +0000 UTC m=+2472.789680129" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.102404 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.102992 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.156662 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.173505 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:46:53 crc kubenswrapper[4885]: E1205 20:46:53.173779 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.179759 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-75hsh" podStartSLOduration=8.716306509 podStartE2EDuration="11.179741277s" podCreationTimestamp="2025-12-05 20:46:42 +0000 
UTC" firstStartedPulling="2025-12-05 20:46:44.411704814 +0000 UTC m=+2469.708520475" lastFinishedPulling="2025-12-05 20:46:46.875139582 +0000 UTC m=+2472.171955243" observedRunningTime="2025-12-05 20:46:47.510920737 +0000 UTC m=+2472.807736398" watchObservedRunningTime="2025-12-05 20:46:53.179741277 +0000 UTC m=+2478.476556928" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.297940 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.298222 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.344388 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.573225 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:53 crc kubenswrapper[4885]: I1205 20:46:53.585566 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:55 crc kubenswrapper[4885]: I1205 20:46:55.000797 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:55 crc kubenswrapper[4885]: I1205 20:46:55.549771 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-75hsh" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="registry-server" containerID="cri-o://4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f" gracePeriod=2 Dec 05 20:46:55 crc kubenswrapper[4885]: I1205 20:46:55.994933 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.006577 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.060481 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4cq\" (UniqueName: \"kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq\") pod \"7e9f0373-7c85-4291-8125-8d3b9c00656d\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.060839 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities\") pod \"7e9f0373-7c85-4291-8125-8d3b9c00656d\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.060902 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content\") pod \"7e9f0373-7c85-4291-8125-8d3b9c00656d\" (UID: \"7e9f0373-7c85-4291-8125-8d3b9c00656d\") " Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.061945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities" (OuterVolumeSpecName: "utilities") pod "7e9f0373-7c85-4291-8125-8d3b9c00656d" (UID: "7e9f0373-7c85-4291-8125-8d3b9c00656d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.068274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq" (OuterVolumeSpecName: "kube-api-access-rk4cq") pod "7e9f0373-7c85-4291-8125-8d3b9c00656d" (UID: "7e9f0373-7c85-4291-8125-8d3b9c00656d"). InnerVolumeSpecName "kube-api-access-rk4cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.162641 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4cq\" (UniqueName: \"kubernetes.io/projected/7e9f0373-7c85-4291-8125-8d3b9c00656d-kube-api-access-rk4cq\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.162676 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.564088 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerID="4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f" exitCode=0 Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.564416 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hjrjf" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="registry-server" containerID="cri-o://a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e" gracePeriod=2 Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.564578 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75hsh" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.565017 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerDied","Data":"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f"} Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.565081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hsh" event={"ID":"7e9f0373-7c85-4291-8125-8d3b9c00656d","Type":"ContainerDied","Data":"fad3d7a5ee29e82092b0b69b0027bdcf951c9710cff294bded18f2204dc309a9"} Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.565102 4885 scope.go:117] "RemoveContainer" containerID="4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.597362 4885 scope.go:117] "RemoveContainer" containerID="51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.627665 4885 scope.go:117] "RemoveContainer" containerID="377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.669793 4885 scope.go:117] "RemoveContainer" containerID="4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f" Dec 05 20:46:56 crc kubenswrapper[4885]: E1205 20:46:56.670200 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f\": container with ID starting with 4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f not found: ID does not exist" containerID="4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.670265 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f"} err="failed to get container status \"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f\": rpc error: code = NotFound desc = could not find container \"4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f\": container with ID starting with 4ed91500b39b28b2030e7548628ba09073f32289497b5ce0a37851b29240ed8f not found: ID does not exist" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.670301 4885 scope.go:117] "RemoveContainer" containerID="51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba" Dec 05 20:46:56 crc kubenswrapper[4885]: E1205 20:46:56.670608 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba\": container with ID starting with 51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba not found: ID does not exist" containerID="51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.670665 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba"} err="failed to get container status \"51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba\": rpc error: code = NotFound desc = could not find container 
\"51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba\": container with ID starting with 51363f553a08254f297c094ce2a0b5f3f0c0a54cb6d20c24d97e24b4af2c1cba not found: ID does not exist" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.670691 4885 scope.go:117] "RemoveContainer" containerID="377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30" Dec 05 20:46:56 crc kubenswrapper[4885]: E1205 20:46:56.671065 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30\": container with ID starting with 377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30 not found: ID does not exist" containerID="377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.671171 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30"} err="failed to get container status \"377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30\": rpc error: code = NotFound desc = could not find container \"377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30\": container with ID starting with 377c15e5334f68c24a2c4ac34e0e48ec93f0c415593d6578a5fa9bbd55097b30 not found: ID does not exist" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.953203 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e9f0373-7c85-4291-8125-8d3b9c00656d" (UID: "7e9f0373-7c85-4291-8125-8d3b9c00656d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:56 crc kubenswrapper[4885]: I1205 20:46:56.979361 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9f0373-7c85-4291-8125-8d3b9c00656d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:57 crc kubenswrapper[4885]: E1205 20:46:57.050886 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2e7665_d55c_4cd5_8cfc_7768a953c353.slice/crio-a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.199290 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.208515 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-75hsh"] Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.576665 4885 generic.go:334] "Generic (PLEG): container finished" podID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerID="a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e" exitCode=0 Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.576779 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerDied","Data":"a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e"} Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.862121 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.896581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities\") pod \"de2e7665-d55c-4cd5-8cfc-7768a953c353\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.896943 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content\") pod \"de2e7665-d55c-4cd5-8cfc-7768a953c353\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.897246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gbvc\" (UniqueName: \"kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc\") pod \"de2e7665-d55c-4cd5-8cfc-7768a953c353\" (UID: \"de2e7665-d55c-4cd5-8cfc-7768a953c353\") " Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.897470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities" (OuterVolumeSpecName: "utilities") pod "de2e7665-d55c-4cd5-8cfc-7768a953c353" (UID: "de2e7665-d55c-4cd5-8cfc-7768a953c353"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.899461 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.904512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc" (OuterVolumeSpecName: "kube-api-access-2gbvc") pod "de2e7665-d55c-4cd5-8cfc-7768a953c353" (UID: "de2e7665-d55c-4cd5-8cfc-7768a953c353"). InnerVolumeSpecName "kube-api-access-2gbvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:57 crc kubenswrapper[4885]: I1205 20:46:57.953046 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de2e7665-d55c-4cd5-8cfc-7768a953c353" (UID: "de2e7665-d55c-4cd5-8cfc-7768a953c353"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.002051 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de2e7665-d55c-4cd5-8cfc-7768a953c353-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.002099 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gbvc\" (UniqueName: \"kubernetes.io/projected/de2e7665-d55c-4cd5-8cfc-7768a953c353-kube-api-access-2gbvc\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.592492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrjf" event={"ID":"de2e7665-d55c-4cd5-8cfc-7768a953c353","Type":"ContainerDied","Data":"72e492a6fd398f12f2db93f6403a5af590c9a8c7ad0631928c5249de18ee61d4"} Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.592559 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrjf" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.592910 4885 scope.go:117] "RemoveContainer" containerID="a775dd335e4d7375ebcd4149aeb8cf4eb5dca8d2aa4b0e9c707dfc80ed04bb5e" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.641723 4885 scope.go:117] "RemoveContainer" containerID="89487ae9d61b276830d13cfa0d4760146e5cd27d9b38690099e75ebcddce2854" Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.647201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.656273 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hjrjf"] Dec 05 20:46:58 crc kubenswrapper[4885]: I1205 20:46:58.676386 4885 scope.go:117] "RemoveContainer" containerID="566f739824957d6271cc54fbca7a99fa8890e35f3616002494a54a91353d2dd6" Dec 05 20:46:59 crc kubenswrapper[4885]: I1205 20:46:59.199545 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" path="/var/lib/kubelet/pods/7e9f0373-7c85-4291-8125-8d3b9c00656d/volumes" Dec 05 20:46:59 crc kubenswrapper[4885]: I1205 20:46:59.200211 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" path="/var/lib/kubelet/pods/de2e7665-d55c-4cd5-8cfc-7768a953c353/volumes" Dec 05 20:47:06 crc kubenswrapper[4885]: I1205 20:47:06.174176 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:47:06 crc kubenswrapper[4885]: E1205 20:47:06.174954 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:47:19 crc kubenswrapper[4885]: I1205 20:47:19.173320 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:47:19 crc kubenswrapper[4885]: E1205 20:47:19.174185 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:47:34 crc kubenswrapper[4885]: I1205 20:47:34.173089 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:47:34 crc kubenswrapper[4885]: E1205 20:47:34.174219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:47:46 crc kubenswrapper[4885]: I1205 20:47:46.173712 4885 
scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:47:46 crc kubenswrapper[4885]: E1205 20:47:46.175199 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:48:01 crc kubenswrapper[4885]: I1205 20:48:01.174239 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3" Dec 05 20:48:02 crc kubenswrapper[4885]: I1205 20:48:02.282428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5"} Dec 05 20:48:07 crc kubenswrapper[4885]: I1205 20:48:07.336466 4885 generic.go:334] "Generic (PLEG): container finished" podID="453597ee-fc9f-4fc6-beb2-e4c75e1236db" containerID="3f82718d43b639145aa2718f9457e8e490b74ec75c2de6d655d9d206e9fc792e" exitCode=0 Dec 05 20:48:07 crc kubenswrapper[4885]: I1205 20:48:07.336590 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" event={"ID":"453597ee-fc9f-4fc6-beb2-e4c75e1236db","Type":"ContainerDied","Data":"3f82718d43b639145aa2718f9457e8e490b74ec75c2de6d655d9d206e9fc792e"} Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.884165 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916440 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qkq2\" (UniqueName: \"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916547 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916633 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916682 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916774 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.916838 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key\") pod \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\" (UID: \"453597ee-fc9f-4fc6-beb2-e4c75e1236db\") " Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.926297 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2" (OuterVolumeSpecName: "kube-api-access-6qkq2") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "kube-api-access-6qkq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.945210 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.946525 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.950901 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.953157 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.956306 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory" (OuterVolumeSpecName: "inventory") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.965615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.966010 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:08 crc kubenswrapper[4885]: I1205 20:48:08.978768 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "453597ee-fc9f-4fc6-beb2-e4c75e1236db" (UID: "453597ee-fc9f-4fc6-beb2-e4c75e1236db"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018797 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018833 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qkq2\" (UniqueName: \"kubernetes.io/projected/453597ee-fc9f-4fc6-beb2-e4c75e1236db-kube-api-access-6qkq2\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018844 4885 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018855 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018864 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018872 4885 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018882 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018913 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.018924 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/453597ee-fc9f-4fc6-beb2-e4c75e1236db-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.362276 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" event={"ID":"453597ee-fc9f-4fc6-beb2-e4c75e1236db","Type":"ContainerDied","Data":"8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd"} Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.362320 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8e8bcbc1446c3369dfc9abdd98ca14b060eb51a6482321575c7d02f6f42e76fd" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.362421 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9j89h" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.462552 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m"] Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.462976 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="extract-utilities" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.462998 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="extract-utilities" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463035 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463044 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463061 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="extract-utilities" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463068 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="extract-utilities" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463079 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463085 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463102 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="extract-content" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463107 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="extract-content" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463126 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453597ee-fc9f-4fc6-beb2-e4c75e1236db" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463132 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="453597ee-fc9f-4fc6-beb2-e4c75e1236db" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:09 crc kubenswrapper[4885]: E1205 20:48:09.463144 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="extract-content" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463151 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="extract-content" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463316 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9f0373-7c85-4291-8125-8d3b9c00656d" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: 
I1205 20:48:09.463333 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2e7665-d55c-4cd5-8cfc-7768a953c353" containerName="registry-server" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.463347 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="453597ee-fc9f-4fc6-beb2-e4c75e1236db" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.464089 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.467529 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.467708 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jgfb9" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.467872 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.468016 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.468065 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.476719 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m"] Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528582 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprbn\" (UniqueName: \"kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528777 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.528832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.630605 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631335 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631429 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631470 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.631575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprbn\" (UniqueName: \"kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.636817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.637299 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.637565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.637719 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.638481 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.640792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.648958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprbn\" (UniqueName: \"kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5m26m\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:09 crc kubenswrapper[4885]: I1205 20:48:09.840239 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" Dec 05 20:48:10 crc kubenswrapper[4885]: I1205 20:48:10.386649 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m"] Dec 05 20:48:11 crc kubenswrapper[4885]: I1205 20:48:11.387502 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" event={"ID":"d6e72054-a861-40ce-b2c9-6212896baaf4","Type":"ContainerStarted","Data":"bd97c47c46fb0e8d2a23cc4830ae7144cca1938ac450978623e6bea84b5701b9"} Dec 05 20:48:11 crc kubenswrapper[4885]: I1205 20:48:11.388283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" event={"ID":"d6e72054-a861-40ce-b2c9-6212896baaf4","Type":"ContainerStarted","Data":"b6721eb2e9b4965200723bf4a9581febf219383bb3ab55c6914307d5993ccf90"} Dec 05 20:48:11 crc kubenswrapper[4885]: I1205 20:48:11.406731 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" podStartSLOduration=1.92374145 podStartE2EDuration="2.406711129s" podCreationTimestamp="2025-12-05 20:48:09 +0000 UTC" firstStartedPulling="2025-12-05 20:48:10.395137562 +0000 UTC m=+2555.691953213" lastFinishedPulling="2025-12-05 20:48:10.878107231 +0000 UTC m=+2556.174922892" observedRunningTime="2025-12-05 20:48:11.402451714 +0000 UTC m=+2556.699267385" watchObservedRunningTime="2025-12-05 20:48:11.406711129 +0000 UTC m=+2556.703526800" Dec 05 20:50:14 crc kubenswrapper[4885]: I1205 20:50:14.961501 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:14 crc kubenswrapper[4885]: I1205 20:50:14.965075 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:14 crc kubenswrapper[4885]: I1205 20:50:14.988161 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.090170 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.090459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.090621 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpljm\" (UniqueName: \"kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.193016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpljm\" (UniqueName: \"kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.193134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.193170 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.193574 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.193636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.221378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cpljm\" (UniqueName: \"kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm\") pod \"redhat-operators-wgqmk\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.299048 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:15 crc kubenswrapper[4885]: I1205 20:50:15.796991 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:16 crc kubenswrapper[4885]: I1205 20:50:16.631255 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:50:16 crc kubenswrapper[4885]: I1205 20:50:16.631555 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:50:16 crc kubenswrapper[4885]: I1205 20:50:16.692910 4885 generic.go:334] "Generic (PLEG): container finished" podID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerID="48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807" exitCode=0 Dec 05 20:50:16 crc kubenswrapper[4885]: I1205 20:50:16.693094 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerDied","Data":"48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807"} Dec 05 20:50:16 crc kubenswrapper[4885]: I1205 20:50:16.693200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerStarted","Data":"226e52390cb53eef8c51e0c0682afdd5b6f63f07143ab7b6be0690719a7b188a"} Dec 05 20:50:17 crc kubenswrapper[4885]: I1205 20:50:17.705291 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerStarted","Data":"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd"} Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.154077 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"] Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.157275 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.172608 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"] Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.250877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.250922 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.251060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.352964 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.353120 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.353321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.353604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.354137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.384145 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq\") pod \"redhat-marketplace-css57\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.501033 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.733164 4885 generic.go:334] "Generic (PLEG): container finished" podID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerID="7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd" exitCode=0 Dec 05 20:50:18 crc kubenswrapper[4885]: I1205 20:50:18.733321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerDied","Data":"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd"} Dec 05 20:50:19 crc kubenswrapper[4885]: I1205 20:50:19.039192 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"] Dec 05 20:50:19 crc kubenswrapper[4885]: I1205 20:50:19.751508 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerStarted","Data":"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1"} Dec 05 20:50:19 crc kubenswrapper[4885]: I1205 20:50:19.755367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerStarted","Data":"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"} Dec 05 20:50:19 crc kubenswrapper[4885]: I1205 20:50:19.755403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerStarted","Data":"3046e248857a2cc292440be02dc422a0e0c2dfb9ee0c2edc4bae5100d5be910a"} Dec 05 20:50:19 crc kubenswrapper[4885]: I1205 20:50:19.795222 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgqmk" podStartSLOduration=3.255745781 podStartE2EDuration="5.795147858s" podCreationTimestamp="2025-12-05 20:50:14 +0000 UTC" firstStartedPulling="2025-12-05 20:50:16.695122415 +0000 UTC m=+2681.991938076" lastFinishedPulling="2025-12-05 20:50:19.234524492 +0000 UTC m=+2684.531340153" observedRunningTime="2025-12-05 20:50:19.77577151 +0000 UTC m=+2685.072587191" watchObservedRunningTime="2025-12-05 20:50:19.795147858 +0000 UTC m=+2685.091963529" Dec 05 20:50:20 crc kubenswrapper[4885]: I1205 20:50:20.766596 4885 generic.go:334] "Generic (PLEG): container finished" podID="7673702f-8ec3-4427-badb-00d54a3a2758" containerID="deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2" exitCode=0 Dec 05 20:50:20 crc kubenswrapper[4885]: I1205 20:50:20.766680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerDied","Data":"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"} Dec 05 20:50:21 crc kubenswrapper[4885]: I1205 20:50:21.779509 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="7673702f-8ec3-4427-badb-00d54a3a2758" containerID="fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946" exitCode=0 Dec 05 20:50:21 crc kubenswrapper[4885]: I1205 20:50:21.779601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerDied","Data":"fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946"} Dec 05 20:50:21 crc kubenswrapper[4885]: I1205 20:50:21.781202 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:50:22 crc kubenswrapper[4885]: I1205 20:50:22.789752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerStarted","Data":"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"} Dec 05 20:50:25 crc kubenswrapper[4885]: I1205 20:50:25.299501 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:25 crc kubenswrapper[4885]: I1205 20:50:25.299788 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:25 crc kubenswrapper[4885]: I1205 20:50:25.341589 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:25 crc kubenswrapper[4885]: I1205 20:50:25.366923 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-css57" podStartSLOduration=5.847038648 podStartE2EDuration="7.366896654s" podCreationTimestamp="2025-12-05 20:50:18 +0000 UTC" firstStartedPulling="2025-12-05 20:50:20.769151111 +0000 UTC m=+2686.065966782" lastFinishedPulling="2025-12-05 20:50:22.289009127 +0000 UTC m=+2687.585824788" observedRunningTime="2025-12-05 20:50:23.830297514 +0000 UTC m=+2689.127113185" watchObservedRunningTime="2025-12-05 20:50:25.366896654 +0000 UTC m=+2690.663712325" Dec 05 20:50:25 crc kubenswrapper[4885]: I1205 20:50:25.894557 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.501227 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.502113 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.592642 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.901658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.938522 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:28 crc kubenswrapper[4885]: I1205 20:50:28.938798 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgqmk" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="registry-server" 
containerID="cri-o://ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1" gracePeriod=2 Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.372643 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.486948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content\") pod \"0622ca87-6b3b-45d4-87d8-afa11a96927f\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.487152 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpljm\" (UniqueName: \"kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm\") pod \"0622ca87-6b3b-45d4-87d8-afa11a96927f\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.487285 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities\") pod \"0622ca87-6b3b-45d4-87d8-afa11a96927f\" (UID: \"0622ca87-6b3b-45d4-87d8-afa11a96927f\") " Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.488393 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities" (OuterVolumeSpecName: "utilities") pod "0622ca87-6b3b-45d4-87d8-afa11a96927f" (UID: "0622ca87-6b3b-45d4-87d8-afa11a96927f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.497523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm" (OuterVolumeSpecName: "kube-api-access-cpljm") pod "0622ca87-6b3b-45d4-87d8-afa11a96927f" (UID: "0622ca87-6b3b-45d4-87d8-afa11a96927f"). InnerVolumeSpecName "kube-api-access-cpljm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.590327 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.590380 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpljm\" (UniqueName: \"kubernetes.io/projected/0622ca87-6b3b-45d4-87d8-afa11a96927f-kube-api-access-cpljm\") on node \"crc\" DevicePath \"\"" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.592525 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0622ca87-6b3b-45d4-87d8-afa11a96927f" (UID: "0622ca87-6b3b-45d4-87d8-afa11a96927f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.691846 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0622ca87-6b3b-45d4-87d8-afa11a96927f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.870430 4885 generic.go:334] "Generic (PLEG): container finished" podID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerID="ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1" exitCode=0 Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.870479 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerDied","Data":"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1"} Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.870534 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqmk" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.870846 4885 scope.go:117] "RemoveContainer" containerID="ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.870825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqmk" event={"ID":"0622ca87-6b3b-45d4-87d8-afa11a96927f","Type":"ContainerDied","Data":"226e52390cb53eef8c51e0c0682afdd5b6f63f07143ab7b6be0690719a7b188a"} Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.903849 4885 scope.go:117] "RemoveContainer" containerID="7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.910476 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.919116 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgqmk"] Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.936289 4885 scope.go:117] "RemoveContainer" containerID="48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.977384 4885 scope.go:117] "RemoveContainer" containerID="ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1" Dec 05 20:50:29 crc kubenswrapper[4885]: E1205 20:50:29.978568 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1\": container with ID starting with ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1 not found: ID does not exist" containerID="ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.978622 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1"} err="failed to get container status \"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1\": rpc error: code = NotFound desc = could not find container \"ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1\": container with ID starting with ea9847e26661930121ab715280cc5d9b8df9f6733f82595d911b426ec664b5a1 not found: ID does not exist" Dec 05 20:50:29 crc 
kubenswrapper[4885]: I1205 20:50:29.978657 4885 scope.go:117] "RemoveContainer" containerID="7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd" Dec 05 20:50:29 crc kubenswrapper[4885]: E1205 20:50:29.979354 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd\": container with ID starting with 7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd not found: ID does not exist" containerID="7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.979421 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd"} err="failed to get container status \"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd\": rpc error: code = NotFound desc = could not find container \"7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd\": container with ID starting with 7329da2958b15534adf16d701276861a374649394bacc77031c87c53074c7fbd not found: ID does not exist" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.979467 4885 scope.go:117] "RemoveContainer" containerID="48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807" Dec 05 20:50:29 crc kubenswrapper[4885]: E1205 20:50:29.979809 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807\": container with ID starting with 48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807 not found: ID does not exist" containerID="48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807" Dec 05 20:50:29 crc kubenswrapper[4885]: I1205 20:50:29.979845 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807"} err="failed to get container status \"48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807\": rpc error: code = NotFound desc = could not find container \"48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807\": container with ID starting with 48de2d0c9f9b7e8ed07e621a4dc555ba7332cd274a933ebbaa14f76f77677807 not found: ID does not exist" Dec 05 20:50:31 crc kubenswrapper[4885]: I1205 20:50:31.203564 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" path="/var/lib/kubelet/pods/0622ca87-6b3b-45d4-87d8-afa11a96927f/volumes" Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.338681 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"] Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.338978 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-css57" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="registry-server" containerID="cri-o://df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b" gracePeriod=2 Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.803906 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-css57" Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.875710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq\") pod \"7673702f-8ec3-4427-badb-00d54a3a2758\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.875827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities\") pod \"7673702f-8ec3-4427-badb-00d54a3a2758\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.876013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content\") pod \"7673702f-8ec3-4427-badb-00d54a3a2758\" (UID: \"7673702f-8ec3-4427-badb-00d54a3a2758\") " Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.876610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities" (OuterVolumeSpecName: "utilities") pod "7673702f-8ec3-4427-badb-00d54a3a2758" (UID: "7673702f-8ec3-4427-badb-00d54a3a2758"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.881369 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq" (OuterVolumeSpecName: "kube-api-access-kfrvq") pod "7673702f-8ec3-4427-badb-00d54a3a2758" (UID: "7673702f-8ec3-4427-badb-00d54a3a2758"). InnerVolumeSpecName "kube-api-access-kfrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.901166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7673702f-8ec3-4427-badb-00d54a3a2758" (UID: "7673702f-8ec3-4427-badb-00d54a3a2758"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.926845 4885 generic.go:334] "Generic (PLEG): container finished" podID="7673702f-8ec3-4427-badb-00d54a3a2758" containerID="df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b" exitCode=0 Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.926892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerDied","Data":"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"} Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.926923 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-css57" event={"ID":"7673702f-8ec3-4427-badb-00d54a3a2758","Type":"ContainerDied","Data":"3046e248857a2cc292440be02dc422a0e0c2dfb9ee0c2edc4bae5100d5be910a"} Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.926930 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-css57"
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.926940 4885 scope.go:117] "RemoveContainer" containerID="df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.956940 4885 scope.go:117] "RemoveContainer" containerID="fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946"
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.963189 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"]
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.973076 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-css57"]
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.977758 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.977808 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/7673702f-8ec3-4427-badb-00d54a3a2758-kube-api-access-kfrvq\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.977820 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7673702f-8ec3-4427-badb-00d54a3a2758-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:33 crc kubenswrapper[4885]: I1205 20:50:33.980456 4885 scope.go:117] "RemoveContainer" containerID="deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.043003 4885 scope.go:117] "RemoveContainer" containerID="df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"
Dec 05 20:50:34 crc kubenswrapper[4885]: E1205 20:50:34.043720 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b\": container with ID starting with df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b not found: ID does not exist" containerID="df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.043773 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b"} err="failed to get container status \"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b\": rpc error: code = NotFound desc = could not find container \"df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b\": container with ID starting with df7c13bb775332191b426f22030b548a4d95e0d588b5e6202a7a0e2d02fb7f5b not found: ID does not exist"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.043799 4885 scope.go:117] "RemoveContainer" containerID="fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946"
Dec 05 20:50:34 crc kubenswrapper[4885]: E1205 20:50:34.044408 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946\": container with ID starting with fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946 not found: ID does not exist" containerID="fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.044448 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946"} err="failed to get container status \"fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946\": rpc error: code = NotFound desc = could not find container \"fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946\": container with ID starting with fb03a63f99f512fb9fceb35ad2ebe4fdaf5a64cb228b8d2c280486756fdb1946 not found: ID does not exist"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.044481 4885 scope.go:117] "RemoveContainer" containerID="deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"
Dec 05 20:50:34 crc kubenswrapper[4885]: E1205 20:50:34.044759 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2\": container with ID starting with deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2 not found: ID does not exist" containerID="deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"
Dec 05 20:50:34 crc kubenswrapper[4885]: I1205 20:50:34.044786 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2"} err="failed to get container status \"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2\": rpc error: code = NotFound desc = could not find container \"deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2\": container with ID starting with deb03ccb5f856eec17f34877a933171a500c32ad7a214993e5d1a082f87159c2 not found: ID does not exist"
Dec 05 20:50:35 crc kubenswrapper[4885]: I1205 20:50:35.189396 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" path="/var/lib/kubelet/pods/7673702f-8ec3-4427-badb-00d54a3a2758/volumes"
Dec 05 20:50:37 crc kubenswrapper[4885]: I1205 20:50:37.970804 4885 generic.go:334] "Generic (PLEG): container finished" podID="d6e72054-a861-40ce-b2c9-6212896baaf4" containerID="bd97c47c46fb0e8d2a23cc4830ae7144cca1938ac450978623e6bea84b5701b9" exitCode=0
Dec 05 20:50:37 crc kubenswrapper[4885]: I1205 20:50:37.971013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" event={"ID":"d6e72054-a861-40ce-b2c9-6212896baaf4","Type":"ContainerDied","Data":"bd97c47c46fb0e8d2a23cc4830ae7144cca1938ac450978623e6bea84b5701b9"}
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.369196 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m"
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504553 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mprbn\" (UniqueName: \"kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504642 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504793 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504904 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.504969 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory\") pod \"d6e72054-a861-40ce-b2c9-6212896baaf4\" (UID: \"d6e72054-a861-40ce-b2c9-6212896baaf4\") "
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.510226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.510475 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn" (OuterVolumeSpecName: "kube-api-access-mprbn") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "kube-api-access-mprbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.535326 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory" (OuterVolumeSpecName: "inventory") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.544119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.550976 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.553206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.558407 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d6e72054-a861-40ce-b2c9-6212896baaf4" (UID: "d6e72054-a861-40ce-b2c9-6212896baaf4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606909 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606944 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606958 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606971 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606985 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.606998 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mprbn\" (UniqueName: \"kubernetes.io/projected/d6e72054-a861-40ce-b2c9-6212896baaf4-kube-api-access-mprbn\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.607010 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6e72054-a861-40ce-b2c9-6212896baaf4-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.997086 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m" event={"ID":"d6e72054-a861-40ce-b2c9-6212896baaf4","Type":"ContainerDied","Data":"b6721eb2e9b4965200723bf4a9581febf219383bb3ab55c6914307d5993ccf90"}
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.997126 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6721eb2e9b4965200723bf4a9581febf219383bb3ab55c6914307d5993ccf90"
Dec 05 20:50:39 crc kubenswrapper[4885]: I1205 20:50:39.997290 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5m26m"
Dec 05 20:50:46 crc kubenswrapper[4885]: I1205 20:50:46.631614 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:50:46 crc kubenswrapper[4885]: I1205 20:50:46.632146 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:51:16 crc kubenswrapper[4885]: I1205 20:51:16.630668 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:51:16 crc kubenswrapper[4885]: I1205 20:51:16.631194 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:51:16 crc kubenswrapper[4885]: I1205 20:51:16.631246 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc"
Dec 05 20:51:16 crc kubenswrapper[4885]: I1205 20:51:16.632003 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:51:16 crc kubenswrapper[4885]: I1205 20:51:16.632093 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5" gracePeriod=600
Dec 05 20:51:17 crc kubenswrapper[4885]: I1205 20:51:17.357140 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5" exitCode=0
Dec 05 20:51:17 crc kubenswrapper[4885]: I1205 20:51:17.357223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5"}
Dec 05 20:51:17 crc kubenswrapper[4885]: I1205 20:51:17.357716 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"}
Dec 05 20:51:17 crc kubenswrapper[4885]: I1205 20:51:17.357739 4885 scope.go:117] "RemoveContainer" containerID="390506d030e26bef5b9d4cc9367a2e58963e75949714b911ae9b81d5347b9ba3"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.437621 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438485 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438500 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438526 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e72054-a861-40ce-b2c9-6212896baaf4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438534 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e72054-a861-40ce-b2c9-6212896baaf4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438553 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="extract-content"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438559 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="extract-content"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438569 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438575 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438589 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="extract-utilities"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438594 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="extract-utilities"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438605 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="extract-utilities"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438611 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="extract-utilities"
Dec 05 20:51:20 crc kubenswrapper[4885]: E1205 20:51:20.438622 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="extract-content"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438628 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="extract-content"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438790 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7673702f-8ec3-4427-badb-00d54a3a2758" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438806 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0622ca87-6b3b-45d4-87d8-afa11a96927f" containerName="registry-server"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.438828 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e72054-a861-40ce-b2c9-6212896baaf4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.439424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.441579 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.442106 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fl7jz"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.445658 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.445872 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.451695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.451790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.451842 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.469661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554090 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554143 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4c49\" (UniqueName: \"kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.554479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.555435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.555437 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.563331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656143 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656523 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4c49\" (UniqueName: \"kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.656768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.657119 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.657631 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.667363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.667794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.673083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4c49\" (UniqueName: \"kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.697300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " pod="openstack/tempest-tests-tempest"
Dec 05 20:51:20 crc kubenswrapper[4885]: I1205 20:51:20.775307 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 05 20:51:21 crc kubenswrapper[4885]: I1205 20:51:21.276019 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 05 20:51:21 crc kubenswrapper[4885]: W1205 20:51:21.282986 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f679f95_52b0_4cdd_a9f2_f7dcd5f23d2d.slice/crio-1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683 WatchSource:0}: Error finding container 1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683: Status 404 returned error can't find the container with id 1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683
Dec 05 20:51:21 crc kubenswrapper[4885]: I1205 20:51:21.398769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d","Type":"ContainerStarted","Data":"1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683"}
Dec 05 20:51:54 crc kubenswrapper[4885]: E1205 20:51:54.355896 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Dec 05 20:51:54 crc kubenswrapper[4885]: E1205 20:51:54.356731 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c4c49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 20:51:54 crc kubenswrapper[4885]: E1205 20:51:54.357958 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"
Dec 05 20:51:54 crc kubenswrapper[4885]: E1205 20:51:54.742385 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"
Dec 05 20:52:09 crc kubenswrapper[4885]: I1205 20:52:09.677666 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 05 20:52:10 crc kubenswrapper[4885]: I1205 20:52:10.906495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d","Type":"ContainerStarted","Data":"09416ae515902e13b08a2249a3254576caeadbb53906043e24db61143a5e86e4"}
Dec 05 20:52:10 crc kubenswrapper[4885]: I1205 20:52:10.934835 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.544426211 podStartE2EDuration="51.934805799s" podCreationTimestamp="2025-12-05 20:51:19 +0000 UTC" firstStartedPulling="2025-12-05 20:51:21.285312594 +0000 UTC m=+2746.582128265" lastFinishedPulling="2025-12-05 20:52:09.675692192 +0000 UTC m=+2794.972507853" observedRunningTime="2025-12-05 20:52:10.93008496 +0000 UTC m=+2796.226900621" watchObservedRunningTime="2025-12-05 20:52:10.934805799 +0000 UTC m=+2796.231621470"
Dec 05 20:53:16 crc kubenswrapper[4885]: I1205 20:53:16.631575 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:53:16 crc kubenswrapper[4885]: I1205 20:53:16.632181 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:53:46 crc kubenswrapper[4885]: I1205 20:53:46.631068 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:53:46 crc kubenswrapper[4885]: I1205 20:53:46.631684 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:54:16 crc kubenswrapper[4885]: I1205 20:54:16.631432 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:54:16 crc kubenswrapper[4885]: I1205 20:54:16.631994 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:54:16 crc kubenswrapper[4885]: I1205 20:54:16.632061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc"
Dec 05 20:54:16 crc kubenswrapper[4885]: I1205 20:54:16.632845 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:54:16 crc kubenswrapper[4885]: I1205 20:54:16.632903 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" gracePeriod=600
Dec 05 20:54:16 crc kubenswrapper[4885]: E1205 20:54:16.823045 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:54:17 crc kubenswrapper[4885]: I1205 20:54:17.199232 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" exitCode=0
Dec 05 20:54:17 crc kubenswrapper[4885]: I1205 20:54:17.203259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"}
Dec 05 20:54:17 crc kubenswrapper[4885]: I1205 20:54:17.203308 4885 scope.go:117] "RemoveContainer" containerID="5472b896165a120797b4837756fc5f6fc90406538f96f516b4ccfa0b788d4fb5"
Dec 05 20:54:17 crc kubenswrapper[4885]: I1205 20:54:17.203936 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:54:17 crc kubenswrapper[4885]: E1205 20:54:17.204218 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:54:31 crc kubenswrapper[4885]: I1205 20:54:31.173377 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:54:31 crc kubenswrapper[4885]: E1205 20:54:31.174309 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:54:46 crc kubenswrapper[4885]: I1205 20:54:46.172641 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:54:46 crc kubenswrapper[4885]: E1205 20:54:46.173375 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:55:00 crc kubenswrapper[4885]: I1205 20:55:00.172316 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:55:00 crc kubenswrapper[4885]: E1205 20:55:00.173194 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:55:15 crc kubenswrapper[4885]: I1205 20:55:15.179705 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:55:15 crc kubenswrapper[4885]: E1205 20:55:15.180712 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:55:27 crc kubenswrapper[4885]: I1205 20:55:27.173349 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:55:27 crc kubenswrapper[4885]: E1205 20:55:27.174538 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:55:39 crc kubenswrapper[4885]: I1205 20:55:39.172734 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:55:39 crc kubenswrapper[4885]: E1205 20:55:39.173651 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:55:50 crc kubenswrapper[4885]: I1205 20:55:50.172829 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:55:50 crc kubenswrapper[4885]: E1205 20:55:50.173516 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:56:05 crc kubenswrapper[4885]: I1205 20:56:05.179470 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:56:05 crc kubenswrapper[4885]: E1205 20:56:05.180807 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:56:20 crc kubenswrapper[4885]: I1205 20:56:20.172825 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:56:20 crc kubenswrapper[4885]: E1205 20:56:20.173424 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:56:34 crc kubenswrapper[4885]: I1205 20:56:34.174249 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:56:34 crc kubenswrapper[4885]: E1205 20:56:34.175496 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:56:49 crc kubenswrapper[4885]: I1205 20:56:49.173151 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:56:49 crc kubenswrapper[4885]: E1205 20:56:49.174421 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:57:04 crc kubenswrapper[4885]: I1205 20:57:04.173783 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:57:04 crc kubenswrapper[4885]: E1205 20:57:04.174794 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:57:18 crc kubenswrapper[4885]: I1205 20:57:18.173468 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:57:18 crc kubenswrapper[4885]: E1205 20:57:18.174634 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:57:29 crc kubenswrapper[4885]: I1205 20:57:29.173379 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:57:29 crc kubenswrapper[4885]: E1205 20:57:29.177375 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:57:36 crc kubenswrapper[4885]: I1205 20:57:36.998924 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqhld"]
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.001895 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.013249 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqhld"]
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.142832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jp9\" (UniqueName: \"kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.143120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.143472 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.244966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.245106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jp9\" (UniqueName: \"kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.245563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.245634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.246151 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.266847 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jp9\" (UniqueName: \"kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9\") pod \"community-operators-vqhld\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") " pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.342850 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.873036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqhld"]
Dec 05 20:57:37 crc kubenswrapper[4885]: I1205 20:57:37.910977 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerStarted","Data":"e755160f1d77596105d44315bbb16668713485858d320aa531f9534acdac5c83"}
Dec 05 20:57:38 crc kubenswrapper[4885]: I1205 20:57:38.924852 4885 generic.go:334] "Generic (PLEG): container finished" podID="446689ad-06c9-4bed-8695-97d52ae46e15" containerID="f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5" exitCode=0
Dec 05 20:57:38 crc kubenswrapper[4885]: I1205 20:57:38.924967 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerDied","Data":"f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5"}
Dec 05 20:57:38 crc kubenswrapper[4885]: I1205 20:57:38.927779 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 20:57:39 crc kubenswrapper[4885]: I1205 20:57:39.935267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerStarted","Data":"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b"}
Dec 05 20:57:40 crc kubenswrapper[4885]: I1205 20:57:40.951659 4885 generic.go:334] "Generic (PLEG): container finished" podID="446689ad-06c9-4bed-8695-97d52ae46e15" containerID="74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b" exitCode=0
Dec 05 20:57:40 crc kubenswrapper[4885]: I1205 20:57:40.951740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerDied","Data":"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b"}
Dec 05 20:57:41 crc kubenswrapper[4885]: I1205 20:57:41.173882 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d"
Dec 05 20:57:41 crc kubenswrapper[4885]: E1205 20:57:41.174480 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 20:57:42 crc kubenswrapper[4885]: I1205 20:57:42.976774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerStarted","Data":"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684"}
Dec 05 20:57:43 crc kubenswrapper[4885]: I1205 20:57:43.005066 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqhld" podStartSLOduration=3.526095443 podStartE2EDuration="7.005043628s" podCreationTimestamp="2025-12-05 20:57:36 +0000 UTC" firstStartedPulling="2025-12-05 20:57:38.927549119 +0000 UTC m=+3124.224364780" lastFinishedPulling="2025-12-05 20:57:42.406497304 +0000 UTC m=+3127.703312965" observedRunningTime="2025-12-05 20:57:42.996381446 +0000 UTC m=+3128.293197107" watchObservedRunningTime="2025-12-05 20:57:43.005043628 +0000 UTC m=+3128.301859289"
Dec 05 20:57:47 crc kubenswrapper[4885]: I1205 20:57:47.344040 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:47 crc kubenswrapper[4885]: I1205 20:57:47.344544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:47 crc kubenswrapper[4885]: I1205 20:57:47.404809 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:48 crc kubenswrapper[4885]: I1205 20:57:48.074141 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:48 crc kubenswrapper[4885]: I1205 20:57:48.133485 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqhld"]
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.048110 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqhld" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="registry-server" containerID="cri-o://3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684" gracePeriod=2
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.644511 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqhld"
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.836265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76jp9\" (UniqueName: \"kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9\") pod \"446689ad-06c9-4bed-8695-97d52ae46e15\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") "
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.836381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities\") pod \"446689ad-06c9-4bed-8695-97d52ae46e15\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") "
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.836426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content\") pod \"446689ad-06c9-4bed-8695-97d52ae46e15\" (UID: \"446689ad-06c9-4bed-8695-97d52ae46e15\") "
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.838086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities" (OuterVolumeSpecName: "utilities") pod "446689ad-06c9-4bed-8695-97d52ae46e15" (UID: "446689ad-06c9-4bed-8695-97d52ae46e15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.842233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9" (OuterVolumeSpecName: "kube-api-access-76jp9") pod "446689ad-06c9-4bed-8695-97d52ae46e15" (UID: "446689ad-06c9-4bed-8695-97d52ae46e15"). InnerVolumeSpecName "kube-api-access-76jp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.906311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "446689ad-06c9-4bed-8695-97d52ae46e15" (UID: "446689ad-06c9-4bed-8695-97d52ae46e15"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.938956 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76jp9\" (UniqueName: \"kubernetes.io/projected/446689ad-06c9-4bed-8695-97d52ae46e15-kube-api-access-76jp9\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.939012 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:50 crc kubenswrapper[4885]: I1205 20:57:50.939056 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446689ad-06c9-4bed-8695-97d52ae46e15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.060478 4885 generic.go:334] "Generic (PLEG): container finished" podID="446689ad-06c9-4bed-8695-97d52ae46e15" containerID="3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684" exitCode=0 Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.060529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerDied","Data":"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684"} Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.060567 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqhld" event={"ID":"446689ad-06c9-4bed-8695-97d52ae46e15","Type":"ContainerDied","Data":"e755160f1d77596105d44315bbb16668713485858d320aa531f9534acdac5c83"} Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.060598 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqhld" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.060598 4885 scope.go:117] "RemoveContainer" containerID="3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.113884 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqhld"] Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.116907 4885 scope.go:117] "RemoveContainer" containerID="74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.128762 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqhld"] Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.139880 4885 scope.go:117] "RemoveContainer" containerID="f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.187188 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" path="/var/lib/kubelet/pods/446689ad-06c9-4bed-8695-97d52ae46e15/volumes" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.189460 4885 scope.go:117] "RemoveContainer" containerID="3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684" Dec 05 20:57:51 crc kubenswrapper[4885]: E1205 20:57:51.190314 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684\": container with ID starting with 3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684 not found: ID does not exist" containerID="3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.190345 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684"} err="failed to get container status \"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684\": rpc error: code = NotFound desc = could not find container \"3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684\": container with ID starting with 3d4cc26c07027c26eb29f69cb7e019dc9a91a0bba2ce7497ab8adbf80fd09684 not found: ID does not exist" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.190364 4885 scope.go:117] "RemoveContainer" containerID="74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b" Dec 05 20:57:51 crc kubenswrapper[4885]: E1205 20:57:51.190626 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b\": container with ID starting with 74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b not found: ID does not exist" containerID="74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.190650 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b"} err="failed to get container status \"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b\": rpc error: code = NotFound desc = could not find container 
\"74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b\": container with ID starting with 74dad162b8994c5d97b8cbc4e2a8452d07a28b95d0610d9c5001345e08b6fd9b not found: ID does not exist" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.190665 4885 scope.go:117] "RemoveContainer" containerID="f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5" Dec 05 20:57:51 crc kubenswrapper[4885]: E1205 20:57:51.190965 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5\": container with ID starting with f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5 not found: ID does not exist" containerID="f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5" Dec 05 20:57:51 crc kubenswrapper[4885]: I1205 20:57:51.190987 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5"} err="failed to get container status \"f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5\": rpc error: code = NotFound desc = could not find container \"f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5\": container with ID starting with f4b691fa2c9d802c8be8795dcaaca94852a2b249c4b34f0e4af9779882dc27e5 not found: ID does not exist" Dec 05 20:57:52 crc kubenswrapper[4885]: I1205 20:57:52.172425 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:57:52 crc kubenswrapper[4885]: E1205 20:57:52.172789 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.453938 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:57:55 crc kubenswrapper[4885]: E1205 20:57:55.455070 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="registry-server" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.455094 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="registry-server" Dec 05 20:57:55 crc kubenswrapper[4885]: E1205 20:57:55.455133 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="extract-utilities" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.455147 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="extract-utilities" Dec 05 20:57:55 crc kubenswrapper[4885]: E1205 20:57:55.455177 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="extract-content" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.455190 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="extract-content" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.455506 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="446689ad-06c9-4bed-8695-97d52ae46e15" containerName="registry-server" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.457639 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.464845 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.631710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.632210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.632311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.734476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.734673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.734723 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.734960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.735114 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.762004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc\") pod \"certified-operators-rpmxn\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:55 crc kubenswrapper[4885]: I1205 20:57:55.800708 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:57:56 crc kubenswrapper[4885]: I1205 20:57:56.331160 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:57:57 crc kubenswrapper[4885]: I1205 20:57:57.122797 4885 generic.go:334] "Generic (PLEG): container finished" podID="66332460-7087-413a-99a8-5240703b067a" containerID="156b470e15308146a5fdef022bb45a21f778d4c80647bf2359b958770be83580" exitCode=0 Dec 05 20:57:57 crc kubenswrapper[4885]: I1205 20:57:57.123634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerDied","Data":"156b470e15308146a5fdef022bb45a21f778d4c80647bf2359b958770be83580"} Dec 05 20:57:57 crc kubenswrapper[4885]: I1205 20:57:57.123747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerStarted","Data":"47f25eb73010a18f1325fcafa0b2f225d95fc19883baafe110fb14228850a721"} Dec 05 20:57:58 crc kubenswrapper[4885]: I1205 20:57:58.134787 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerStarted","Data":"12bfc5f92007d88d06ec1c917c72d181adbbb3308db9c386a907bf5031ecd153"} Dec 05 20:57:59 crc kubenswrapper[4885]: I1205 20:57:59.149899 4885 generic.go:334] "Generic (PLEG): container finished" podID="66332460-7087-413a-99a8-5240703b067a" containerID="12bfc5f92007d88d06ec1c917c72d181adbbb3308db9c386a907bf5031ecd153" exitCode=0 Dec 05 20:57:59 crc kubenswrapper[4885]: I1205 20:57:59.149948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerDied","Data":"12bfc5f92007d88d06ec1c917c72d181adbbb3308db9c386a907bf5031ecd153"} Dec 05 20:58:00 crc kubenswrapper[4885]: I1205 20:58:00.163495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerStarted","Data":"eee84cfc97bfa3c97d02087ae6b5068035cd3498d97b9c18708b4faf0133f871"} Dec 05 20:58:00 crc kubenswrapper[4885]: I1205 20:58:00.183878 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rpmxn" podStartSLOduration=2.724899327 podStartE2EDuration="5.183861496s" podCreationTimestamp="2025-12-05 20:57:55 +0000 UTC" firstStartedPulling="2025-12-05 20:57:57.124527356 +0000 UTC m=+3142.421343017" lastFinishedPulling="2025-12-05 20:57:59.583489525 +0000 
UTC m=+3144.880305186" observedRunningTime="2025-12-05 20:58:00.181204003 +0000 UTC m=+3145.478019664" watchObservedRunningTime="2025-12-05 20:58:00.183861496 +0000 UTC m=+3145.480677157" Dec 05 20:58:03 crc kubenswrapper[4885]: I1205 20:58:03.173100 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:58:03 crc kubenswrapper[4885]: E1205 20:58:03.173690 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:58:05 crc kubenswrapper[4885]: I1205 20:58:05.801067 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:05 crc kubenswrapper[4885]: I1205 20:58:05.801721 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:05 crc kubenswrapper[4885]: I1205 20:58:05.859906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:06 crc kubenswrapper[4885]: I1205 20:58:06.285339 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:06 crc kubenswrapper[4885]: I1205 20:58:06.351605 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:58:08 crc kubenswrapper[4885]: I1205 20:58:08.232152 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rpmxn" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="registry-server" containerID="cri-o://eee84cfc97bfa3c97d02087ae6b5068035cd3498d97b9c18708b4faf0133f871" gracePeriod=2 Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.243475 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerDied","Data":"eee84cfc97bfa3c97d02087ae6b5068035cd3498d97b9c18708b4faf0133f871"} Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.243290 4885 generic.go:334] "Generic (PLEG): container finished" podID="66332460-7087-413a-99a8-5240703b067a" containerID="eee84cfc97bfa3c97d02087ae6b5068035cd3498d97b9c18708b4faf0133f871" exitCode=0 Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.244132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpmxn" event={"ID":"66332460-7087-413a-99a8-5240703b067a","Type":"ContainerDied","Data":"47f25eb73010a18f1325fcafa0b2f225d95fc19883baafe110fb14228850a721"} Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.244145 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47f25eb73010a18f1325fcafa0b2f225d95fc19883baafe110fb14228850a721" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.253454 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.444571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities\") pod \"66332460-7087-413a-99a8-5240703b067a\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.444636 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content\") pod \"66332460-7087-413a-99a8-5240703b067a\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.444738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc\") pod \"66332460-7087-413a-99a8-5240703b067a\" (UID: \"66332460-7087-413a-99a8-5240703b067a\") " Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.445497 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities" (OuterVolumeSpecName: "utilities") pod "66332460-7087-413a-99a8-5240703b067a" (UID: "66332460-7087-413a-99a8-5240703b067a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.455505 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc" (OuterVolumeSpecName: "kube-api-access-fvtdc") pod "66332460-7087-413a-99a8-5240703b067a" (UID: "66332460-7087-413a-99a8-5240703b067a"). InnerVolumeSpecName "kube-api-access-fvtdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.501412 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66332460-7087-413a-99a8-5240703b067a" (UID: "66332460-7087-413a-99a8-5240703b067a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.546361 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/66332460-7087-413a-99a8-5240703b067a-kube-api-access-fvtdc\") on node \"crc\" DevicePath \"\"" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.546405 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:58:09 crc kubenswrapper[4885]: I1205 20:58:09.546422 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66332460-7087-413a-99a8-5240703b067a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:58:10 crc kubenswrapper[4885]: I1205 20:58:10.250873 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpmxn" Dec 05 20:58:10 crc kubenswrapper[4885]: I1205 20:58:10.287754 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:58:10 crc kubenswrapper[4885]: I1205 20:58:10.296160 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rpmxn"] Dec 05 20:58:11 crc kubenswrapper[4885]: I1205 20:58:11.182629 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66332460-7087-413a-99a8-5240703b067a" path="/var/lib/kubelet/pods/66332460-7087-413a-99a8-5240703b067a/volumes" Dec 05 20:58:18 crc kubenswrapper[4885]: I1205 20:58:18.172774 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:58:18 crc kubenswrapper[4885]: E1205 20:58:18.173758 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:58:31 crc kubenswrapper[4885]: I1205 20:58:31.172896 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:58:31 crc kubenswrapper[4885]: E1205 20:58:31.173951 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:58:42 crc kubenswrapper[4885]: I1205 20:58:42.172577 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:58:42 crc kubenswrapper[4885]: E1205 20:58:42.174079 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:58:55 crc kubenswrapper[4885]: I1205 20:58:55.181440 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:58:55 crc kubenswrapper[4885]: E1205 20:58:55.182286 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:59:06 crc kubenswrapper[4885]: I1205 20:59:06.173869 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 
20:59:06 crc kubenswrapper[4885]: E1205 20:59:06.174964 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 20:59:17 crc kubenswrapper[4885]: I1205 20:59:17.173006 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 20:59:17 crc kubenswrapper[4885]: I1205 20:59:17.908401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff"} Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.146923 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5"] Dec 05 21:00:00 crc kubenswrapper[4885]: E1205 21:00:00.147930 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="extract-utilities" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.147946 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="extract-utilities" Dec 05 21:00:00 crc kubenswrapper[4885]: E1205 21:00:00.147974 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="registry-server" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.147981 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="registry-server" Dec 05 21:00:00 crc kubenswrapper[4885]: E1205 21:00:00.148036 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="extract-content" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.148042 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="extract-content" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.148232 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="66332460-7087-413a-99a8-5240703b067a" containerName="registry-server" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.148953 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.151302 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.151995 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.157687 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5"] Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.206319 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbh9\" (UniqueName: \"kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.206502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.206548 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.308220 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.308293 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.308448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbh9\" (UniqueName: \"kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.309498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume\") pod 
\"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.320707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.328086 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbh9\" (UniqueName: \"kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9\") pod \"collect-profiles-29416140-wj9v5\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.472120 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:00 crc kubenswrapper[4885]: I1205 21:00:00.925634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5"] Dec 05 21:00:01 crc kubenswrapper[4885]: I1205 21:00:01.338091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" event={"ID":"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8","Type":"ContainerStarted","Data":"1264072758feb7d428ea9c0f5cb6cc4452d4769455125122330e1d3430e12860"} Dec 05 21:00:01 crc kubenswrapper[4885]: I1205 21:00:01.338342 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" event={"ID":"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8","Type":"ContainerStarted","Data":"0760ed21607b8df93c3038e44f340740548f7dd926e505aea1f438ea893d6e6c"} Dec 05 21:00:01 crc kubenswrapper[4885]: I1205 21:00:01.366872 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" podStartSLOduration=1.366853979 podStartE2EDuration="1.366853979s" podCreationTimestamp="2025-12-05 21:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:01.362446741 +0000 UTC m=+3266.659262412" watchObservedRunningTime="2025-12-05 21:00:01.366853979 +0000 UTC m=+3266.663669640" Dec 05 21:00:02 crc kubenswrapper[4885]: I1205 21:00:02.349128 4885 generic.go:334] "Generic (PLEG): container finished" podID="cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" containerID="1264072758feb7d428ea9c0f5cb6cc4452d4769455125122330e1d3430e12860" exitCode=0 Dec 05 21:00:02 crc kubenswrapper[4885]: I1205 21:00:02.349145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" event={"ID":"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8","Type":"ContainerDied","Data":"1264072758feb7d428ea9c0f5cb6cc4452d4769455125122330e1d3430e12860"} Dec 05 21:00:03 crc kubenswrapper[4885]: I1205 21:00:03.990518 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.083953 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnbh9\" (UniqueName: \"kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9\") pod \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.084116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume\") pod \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.084167 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume\") pod \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\" (UID: \"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8\") " Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.084820 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" (UID: "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.096381 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" (UID: "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.096538 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9" (OuterVolumeSpecName: "kube-api-access-hnbh9") pod "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" (UID: "cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8"). InnerVolumeSpecName "kube-api-access-hnbh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.185638 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.185878 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnbh9\" (UniqueName: \"kubernetes.io/projected/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-kube-api-access-hnbh9\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.185975 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.368120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" event={"ID":"cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8","Type":"ContainerDied","Data":"0760ed21607b8df93c3038e44f340740548f7dd926e505aea1f438ea893d6e6c"} Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.368524 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0760ed21607b8df93c3038e44f340740548f7dd926e505aea1f438ea893d6e6c" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.368155 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-wj9v5" Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.505691 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f"] Dec 05 21:00:04 crc kubenswrapper[4885]: I1205 21:00:04.513848 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-pbz7f"] Dec 05 21:00:05 crc kubenswrapper[4885]: I1205 21:00:05.182823 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1caf49a1-5103-4ab8-b2a8-8d395fe66c43" path="/var/lib/kubelet/pods/1caf49a1-5103-4ab8-b2a8-8d395fe66c43/volumes" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.220978 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:20 crc kubenswrapper[4885]: E1205 21:00:20.221804 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" containerName="collect-profiles" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.221819 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" containerName="collect-profiles" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.222058 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0c6fb1-1e1d-4bc6-b696-455fdb30ccb8" containerName="collect-profiles" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.223272 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.236187 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.296971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.297130 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xqhl\" (UniqueName: \"kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.297309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.398666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.398773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.398808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xqhl\" (UniqueName: \"kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.399652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.399911 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.427391 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2xqhl\" (UniqueName: \"kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl\") pod \"redhat-marketplace-tj7g8\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:20 crc kubenswrapper[4885]: I1205 21:00:20.564794 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:21 crc kubenswrapper[4885]: I1205 21:00:21.213261 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:21 crc kubenswrapper[4885]: I1205 21:00:21.519636 4885 generic.go:334] "Generic (PLEG): container finished" podID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerID="2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29" exitCode=0 Dec 05 21:00:21 crc kubenswrapper[4885]: I1205 21:00:21.519693 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerDied","Data":"2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29"} Dec 05 21:00:21 crc kubenswrapper[4885]: I1205 21:00:21.520000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerStarted","Data":"57b33cb36d7b1c5b4eb53e6ed22662ba1cb349481e0f67b7eb768a20d9037d5c"} Dec 05 21:00:22 crc kubenswrapper[4885]: I1205 21:00:22.531153 4885 generic.go:334] "Generic (PLEG): container finished" podID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerID="da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae" exitCode=0 Dec 05 21:00:22 crc kubenswrapper[4885]: I1205 21:00:22.531250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerDied","Data":"da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae"} Dec 05 21:00:23 crc kubenswrapper[4885]: I1205 21:00:23.571764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerStarted","Data":"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf"} Dec 05 21:00:23 crc kubenswrapper[4885]: I1205 21:00:23.600798 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tj7g8" podStartSLOduration=2.149236202 podStartE2EDuration="3.600776158s" podCreationTimestamp="2025-12-05 21:00:20 +0000 UTC" firstStartedPulling="2025-12-05 21:00:21.521553237 +0000 UTC m=+3286.818368918" lastFinishedPulling="2025-12-05 21:00:22.973093203 +0000 UTC m=+3288.269908874" observedRunningTime="2025-12-05 21:00:23.591148727 +0000 UTC m=+3288.887964388" watchObservedRunningTime="2025-12-05 21:00:23.600776158 +0000 UTC m=+3288.897591819" Dec 05 21:00:30 crc kubenswrapper[4885]: I1205 21:00:30.565541 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:30 crc kubenswrapper[4885]: I1205 21:00:30.566326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:30 crc kubenswrapper[4885]: I1205 21:00:30.645708 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:30 crc kubenswrapper[4885]: I1205 21:00:30.721002 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:30 crc kubenswrapper[4885]: I1205 21:00:30.892404 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:32 crc kubenswrapper[4885]: I1205 21:00:32.659159 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tj7g8" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="registry-server" containerID="cri-o://b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf" gracePeriod=2 Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.126723 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.258751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xqhl\" (UniqueName: \"kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl\") pod \"5bd1d40e-b5da-412e-9e22-903f86f62371\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.258881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities\") pod \"5bd1d40e-b5da-412e-9e22-903f86f62371\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.259034 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content\") pod \"5bd1d40e-b5da-412e-9e22-903f86f62371\" (UID: \"5bd1d40e-b5da-412e-9e22-903f86f62371\") " Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.259922 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities" (OuterVolumeSpecName: "utilities") pod "5bd1d40e-b5da-412e-9e22-903f86f62371" (UID: "5bd1d40e-b5da-412e-9e22-903f86f62371"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.265357 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl" (OuterVolumeSpecName: "kube-api-access-2xqhl") pod "5bd1d40e-b5da-412e-9e22-903f86f62371" (UID: "5bd1d40e-b5da-412e-9e22-903f86f62371"). InnerVolumeSpecName "kube-api-access-2xqhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.276940 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bd1d40e-b5da-412e-9e22-903f86f62371" (UID: "5bd1d40e-b5da-412e-9e22-903f86f62371"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.361886 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.362234 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xqhl\" (UniqueName: \"kubernetes.io/projected/5bd1d40e-b5da-412e-9e22-903f86f62371-kube-api-access-2xqhl\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.362248 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bd1d40e-b5da-412e-9e22-903f86f62371-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.671999 4885 generic.go:334] "Generic (PLEG): container finished" podID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerID="b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf" exitCode=0 Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.672068 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj7g8" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.672068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerDied","Data":"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf"} Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.672131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj7g8" event={"ID":"5bd1d40e-b5da-412e-9e22-903f86f62371","Type":"ContainerDied","Data":"57b33cb36d7b1c5b4eb53e6ed22662ba1cb349481e0f67b7eb768a20d9037d5c"} Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.672154 4885 scope.go:117] "RemoveContainer" containerID="b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.707298 4885 scope.go:117] "RemoveContainer" containerID="da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.713191 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.744720 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj7g8"] Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.752858 4885 scope.go:117] "RemoveContainer" containerID="2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.798159 4885 scope.go:117] "RemoveContainer" containerID="b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf" Dec 05 21:00:33 crc kubenswrapper[4885]: E1205 21:00:33.798657 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf\": container with ID starting with b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf not found: ID does not exist" containerID="b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.798698 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf"} err="failed to get container status \"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf\": rpc error: code = NotFound desc = could not find container \"b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf\": container with ID starting with b0ef6eeb98f915d5388493b99ee49bbb520149b69f0cdea6ae9797836c5ae8cf not found: ID does not exist" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.798724 4885 scope.go:117] "RemoveContainer" containerID="da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae" Dec 05 21:00:33 crc kubenswrapper[4885]: E1205 21:00:33.799139 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae\": container with ID starting with da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae not found: ID does not exist" containerID="da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.799167 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae"} err="failed to get container status \"da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae\": rpc error: code = NotFound desc = could not find container \"da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae\": container with ID starting with da20218deed6040d0a3e57973a80bd5b58ee59d17861dfc67c0a9602990d6aae not found: ID does not exist" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.799184 4885 scope.go:117] "RemoveContainer" containerID="2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29" Dec 05 21:00:33 crc kubenswrapper[4885]: E1205 21:00:33.799477 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29\": container with ID starting with 2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29 not found: ID does not exist" containerID="2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29" Dec 05 21:00:33 crc kubenswrapper[4885]: I1205 21:00:33.799506 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29"} err="failed to get container status \"2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29\": rpc error: code = NotFound desc = could not find container \"2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29\": container with ID starting with 2d3e3c3c001f712e1e45e5c5cbad15a160586c9ed2aad3ffd167310b48d2df29 not found: ID does not exist" Dec 05 21:00:35 crc kubenswrapper[4885]: I1205 21:00:35.189307 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" path="/var/lib/kubelet/pods/5bd1d40e-b5da-412e-9e22-903f86f62371/volumes" Dec 05 21:00:56 crc kubenswrapper[4885]: I1205 21:00:56.080289 4885 scope.go:117] "RemoveContainer" containerID="cfee1ab840ed566ecba7ee5d477a3e165b3ab9c508be3f2a408dbc71dcacc39c" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.164646 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29416141-4x6cq"] Dec 05 21:01:00 crc kubenswrapper[4885]: E1205 21:01:00.165821 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="registry-server" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.165841 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="registry-server" Dec 05 21:01:00 crc kubenswrapper[4885]: E1205 21:01:00.165879 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="extract-utilities" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.165888 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="extract-utilities" Dec 05 21:01:00 crc kubenswrapper[4885]: E1205 21:01:00.165903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="extract-content" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.165910 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="extract-content" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.166163 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd1d40e-b5da-412e-9e22-903f86f62371" containerName="registry-server" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.167089 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.211617 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416141-4x6cq"] Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.307896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.307961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.308372 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.308619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwgv\" (UniqueName: \"kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.410731 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.410854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.411050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.411160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwgv\" (UniqueName: \"kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.425176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.425191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.426004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.430862 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwgv\" (UniqueName: \"kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv\") pod \"keystone-cron-29416141-4x6cq\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.493574 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:00 crc kubenswrapper[4885]: I1205 21:01:00.930887 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416141-4x6cq"] Dec 05 21:01:01 crc kubenswrapper[4885]: I1205 21:01:01.949702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-4x6cq" event={"ID":"39c99e0c-b27c-4703-a5c0-a380c33df665","Type":"ContainerStarted","Data":"3fbce5e1dd88f0e0de4267f311839e1956bacf35e4238ae06adee656d8bda831"} Dec 05 21:01:01 crc kubenswrapper[4885]: I1205 21:01:01.950191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-4x6cq" event={"ID":"39c99e0c-b27c-4703-a5c0-a380c33df665","Type":"ContainerStarted","Data":"43e8b966b99b919edf93b249fd4dfc8bb410006db2ca0f411798d10a8f2a37a2"} Dec 05 21:01:03 crc kubenswrapper[4885]: I1205 21:01:03.971709 4885 generic.go:334] "Generic (PLEG): container finished" podID="39c99e0c-b27c-4703-a5c0-a380c33df665" containerID="3fbce5e1dd88f0e0de4267f311839e1956bacf35e4238ae06adee656d8bda831" exitCode=0 Dec 05 21:01:03 crc kubenswrapper[4885]: I1205 21:01:03.971812 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-4x6cq" event={"ID":"39c99e0c-b27c-4703-a5c0-a380c33df665","Type":"ContainerDied","Data":"3fbce5e1dd88f0e0de4267f311839e1956bacf35e4238ae06adee656d8bda831"} Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.360374 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.515956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data\") pod \"39c99e0c-b27c-4703-a5c0-a380c33df665\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.516135 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwgv\" (UniqueName: \"kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv\") pod \"39c99e0c-b27c-4703-a5c0-a380c33df665\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.516195 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle\") pod \"39c99e0c-b27c-4703-a5c0-a380c33df665\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.516282 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys\") pod \"39c99e0c-b27c-4703-a5c0-a380c33df665\" (UID: \"39c99e0c-b27c-4703-a5c0-a380c33df665\") " Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.522978 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv" (OuterVolumeSpecName: "kube-api-access-8kwgv") pod "39c99e0c-b27c-4703-a5c0-a380c33df665" (UID: "39c99e0c-b27c-4703-a5c0-a380c33df665"). InnerVolumeSpecName "kube-api-access-8kwgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.523350 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "39c99e0c-b27c-4703-a5c0-a380c33df665" (UID: "39c99e0c-b27c-4703-a5c0-a380c33df665"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.550641 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39c99e0c-b27c-4703-a5c0-a380c33df665" (UID: "39c99e0c-b27c-4703-a5c0-a380c33df665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.575544 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data" (OuterVolumeSpecName: "config-data") pod "39c99e0c-b27c-4703-a5c0-a380c33df665" (UID: "39c99e0c-b27c-4703-a5c0-a380c33df665"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.618868 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.618908 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwgv\" (UniqueName: \"kubernetes.io/projected/39c99e0c-b27c-4703-a5c0-a380c33df665-kube-api-access-8kwgv\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.618921 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.618930 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39c99e0c-b27c-4703-a5c0-a380c33df665-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.991986 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416141-4x6cq" event={"ID":"39c99e0c-b27c-4703-a5c0-a380c33df665","Type":"ContainerDied","Data":"43e8b966b99b919edf93b249fd4dfc8bb410006db2ca0f411798d10a8f2a37a2"} Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.992057 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43e8b966b99b919edf93b249fd4dfc8bb410006db2ca0f411798d10a8f2a37a2" Dec 05 21:01:05 crc kubenswrapper[4885]: I1205 21:01:05.992082 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416141-4x6cq" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.461297 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:11 crc kubenswrapper[4885]: E1205 21:01:11.462367 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c99e0c-b27c-4703-a5c0-a380c33df665" containerName="keystone-cron" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.462386 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c99e0c-b27c-4703-a5c0-a380c33df665" containerName="keystone-cron" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.462613 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c99e0c-b27c-4703-a5c0-a380c33df665" containerName="keystone-cron" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.464278 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.479654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.526882 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.526966 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.527115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzpcn\" (UniqueName: \"kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.628864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.628955 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.629085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzpcn\" (UniqueName: \"kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " 
pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.629474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.629555 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.662944 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzpcn\" (UniqueName: \"kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn\") pod \"redhat-operators-9698n\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:11 crc kubenswrapper[4885]: I1205 21:01:11.797146 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:12 crc kubenswrapper[4885]: I1205 21:01:12.285317 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:13 crc kubenswrapper[4885]: I1205 21:01:13.056606 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerID="11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d" exitCode=0 Dec 05 21:01:13 crc kubenswrapper[4885]: I1205 21:01:13.056801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerDied","Data":"11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d"} Dec 05 21:01:13 crc kubenswrapper[4885]: I1205 21:01:13.056903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerStarted","Data":"bde2ec1177fe1a9ea1d0150e527c29bb0354124d12c12e074e3a11be1d96f5a3"} Dec 05 21:01:14 crc kubenswrapper[4885]: I1205 21:01:14.068132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerStarted","Data":"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155"} Dec 05 21:01:16 crc kubenswrapper[4885]: I1205 21:01:16.091167 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerID="8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155" exitCode=0 Dec 05 21:01:16 crc kubenswrapper[4885]: I1205 21:01:16.091225 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerDied","Data":"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155"} Dec 05 21:01:18 crc kubenswrapper[4885]: I1205 21:01:18.114704 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" 
event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerStarted","Data":"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455"} Dec 05 21:01:18 crc kubenswrapper[4885]: I1205 21:01:18.136753 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9698n" podStartSLOduration=2.973468406 podStartE2EDuration="7.136728369s" podCreationTimestamp="2025-12-05 21:01:11 +0000 UTC" firstStartedPulling="2025-12-05 21:01:13.059171295 +0000 UTC m=+3338.355986956" lastFinishedPulling="2025-12-05 21:01:17.222431258 +0000 UTC m=+3342.519246919" observedRunningTime="2025-12-05 21:01:18.132889448 +0000 UTC m=+3343.429705119" watchObservedRunningTime="2025-12-05 21:01:18.136728369 +0000 UTC m=+3343.433544050" Dec 05 21:01:21 crc kubenswrapper[4885]: I1205 21:01:21.797623 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:21 crc kubenswrapper[4885]: I1205 21:01:21.798252 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:22 crc kubenswrapper[4885]: I1205 21:01:22.858030 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9698n" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="registry-server" probeResult="failure" output=< Dec 05 21:01:22 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Dec 05 21:01:22 crc kubenswrapper[4885]: > Dec 05 21:01:31 crc kubenswrapper[4885]: I1205 21:01:31.868949 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:31 crc kubenswrapper[4885]: I1205 21:01:31.937578 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:32 crc kubenswrapper[4885]: I1205 21:01:32.114194 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.243706 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9698n" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="registry-server" containerID="cri-o://0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455" gracePeriod=2 Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.794200 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.946081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzpcn\" (UniqueName: \"kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn\") pod \"1b7e99b9-a9d2-4311-8694-615a33efc01a\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.946264 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content\") pod \"1b7e99b9-a9d2-4311-8694-615a33efc01a\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.946338 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities\") pod \"1b7e99b9-a9d2-4311-8694-615a33efc01a\" (UID: \"1b7e99b9-a9d2-4311-8694-615a33efc01a\") " Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.947278 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities" (OuterVolumeSpecName: "utilities") pod "1b7e99b9-a9d2-4311-8694-615a33efc01a" (UID: "1b7e99b9-a9d2-4311-8694-615a33efc01a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4885]: I1205 21:01:33.951468 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn" (OuterVolumeSpecName: "kube-api-access-qzpcn") pod "1b7e99b9-a9d2-4311-8694-615a33efc01a" (UID: "1b7e99b9-a9d2-4311-8694-615a33efc01a"). InnerVolumeSpecName "kube-api-access-qzpcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.048409 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzpcn\" (UniqueName: \"kubernetes.io/projected/1b7e99b9-a9d2-4311-8694-615a33efc01a-kube-api-access-qzpcn\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.048454 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.063944 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b7e99b9-a9d2-4311-8694-615a33efc01a" (UID: "1b7e99b9-a9d2-4311-8694-615a33efc01a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.150852 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7e99b9-a9d2-4311-8694-615a33efc01a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.254990 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerID="0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455" exitCode=0 Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.255043 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerDied","Data":"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455"} Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.255070 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9698n" event={"ID":"1b7e99b9-a9d2-4311-8694-615a33efc01a","Type":"ContainerDied","Data":"bde2ec1177fe1a9ea1d0150e527c29bb0354124d12c12e074e3a11be1d96f5a3"} Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.255087 4885 scope.go:117] "RemoveContainer" containerID="0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.255200 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9698n" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.283732 4885 scope.go:117] "RemoveContainer" containerID="8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.307215 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.321833 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9698n"] Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.323530 4885 scope.go:117] "RemoveContainer" containerID="11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.367095 4885 scope.go:117] "RemoveContainer" containerID="0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455" Dec 05 21:01:34 crc kubenswrapper[4885]: E1205 21:01:34.368003 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455\": container with ID starting with 0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455 not found: ID does not exist" containerID="0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.368101 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455"} err="failed to get container status \"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455\": rpc error: code = NotFound desc = could not find container \"0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455\": container with ID starting with 0a61e17a250fbe58f2d9869e944fc1f16512a81c1a89736c7e95acb440673455 not found: ID does not exist" Dec 05 21:01:34 crc 
kubenswrapper[4885]: I1205 21:01:34.368131 4885 scope.go:117] "RemoveContainer" containerID="8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155" Dec 05 21:01:34 crc kubenswrapper[4885]: E1205 21:01:34.368459 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155\": container with ID starting with 8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155 not found: ID does not exist" containerID="8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.368490 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155"} err="failed to get container status \"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155\": rpc error: code = NotFound desc = could not find container \"8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155\": container with ID starting with 8a964d0d75601494ad5f2857fcb4c402ef1db6c2368092d40573a89f7a35a155 not found: ID does not exist" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.368511 4885 scope.go:117] "RemoveContainer" containerID="11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d" Dec 05 21:01:34 crc kubenswrapper[4885]: E1205 21:01:34.368784 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d\": container with ID starting with 11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d not found: ID does not exist" containerID="11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d" Dec 05 21:01:34 crc kubenswrapper[4885]: I1205 21:01:34.368816 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d"} err="failed to get container status \"11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d\": rpc error: code = NotFound desc = could not find container \"11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d\": container with ID starting with 11474e77ff9c14a8e6f1807bf1d9b0f6f303b6995c9cd31833e5515a037f2c2d not found: ID does not exist" Dec 05 21:01:35 crc kubenswrapper[4885]: I1205 21:01:35.194743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" path="/var/lib/kubelet/pods/1b7e99b9-a9d2-4311-8694-615a33efc01a/volumes" Dec 05 21:01:46 crc kubenswrapper[4885]: I1205 21:01:46.631261 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:01:46 crc kubenswrapper[4885]: I1205 21:01:46.631928 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:02:16 crc kubenswrapper[4885]: I1205 21:02:16.631263 4885 patch_prober.go:28] interesting 
pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:02:16 crc kubenswrapper[4885]: I1205 21:02:16.631819 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:02:21 crc kubenswrapper[4885]: I1205 21:02:21.751466 4885 generic.go:334] "Generic (PLEG): container finished" podID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" containerID="09416ae515902e13b08a2249a3254576caeadbb53906043e24db61143a5e86e4" exitCode=0 Dec 05 21:02:21 crc kubenswrapper[4885]: I1205 21:02:21.751561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d","Type":"ContainerDied","Data":"09416ae515902e13b08a2249a3254576caeadbb53906043e24db61143a5e86e4"} Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.236754 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.333365 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.333586 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.333688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334395 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334442 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.334575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.335257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data" (OuterVolumeSpecName: "config-data") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.335308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4c49\" (UniqueName: \"kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49\") pod \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\" (UID: \"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d\") " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.336431 4885 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.336456 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.339316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.339961 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49" (OuterVolumeSpecName: "kube-api-access-c4c49") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "kube-api-access-c4c49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.341678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.362775 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.369597 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.370557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.382229 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" (UID: "9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438592 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438628 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4c49\" (UniqueName: \"kubernetes.io/projected/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-kube-api-access-c4c49\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438642 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438651 4885 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438661 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438669 4885 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.438677 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.457676 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.540692 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.776974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d","Type":"ContainerDied","Data":"1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683"} Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.777336 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af558ce4ec9c628cda0d4cbad542e5f4ab020803ffb0bd6464e66e9c5662683" Dec 05 21:02:23 crc kubenswrapper[4885]: I1205 21:02:23.777137 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.774574 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:02:26 crc kubenswrapper[4885]: E1205 21:02:26.775325 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="registry-server" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775338 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="registry-server" Dec 05 21:02:26 crc kubenswrapper[4885]: E1205 21:02:26.775364 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="extract-utilities" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775370 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="extract-utilities" Dec 05 21:02:26 crc kubenswrapper[4885]: E1205 21:02:26.775391 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775397 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:02:26 crc kubenswrapper[4885]: E1205 21:02:26.775415 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="extract-content" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775421 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="extract-content" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775587 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d" containerName="tempest-tests-tempest-tests-runner" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.775600 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b7e99b9-a9d2-4311-8694-615a33efc01a" containerName="registry-server" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.776189 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.782342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fl7jz" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.786398 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.911469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:26 crc kubenswrapper[4885]: I1205 21:02:26.911617 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6cj\" (UniqueName: \"kubernetes.io/projected/8faf97ea-3453-4e96-8a29-a7a30aec54c1-kube-api-access-mm6cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.013178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6cj\" (UniqueName: \"kubernetes.io/projected/8faf97ea-3453-4e96-8a29-a7a30aec54c1-kube-api-access-mm6cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.013317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.013846 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.036768 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6cj\" (UniqueName: \"kubernetes.io/projected/8faf97ea-3453-4e96-8a29-a7a30aec54c1-kube-api-access-mm6cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.048763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8faf97ea-3453-4e96-8a29-a7a30aec54c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc 
kubenswrapper[4885]: I1205 21:02:27.094417 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.555976 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 21:02:27 crc kubenswrapper[4885]: I1205 21:02:27.824718 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8faf97ea-3453-4e96-8a29-a7a30aec54c1","Type":"ContainerStarted","Data":"21e24688fdc74886b9659f1ab7537643c69c25eeaa0da771da1e01cee314db33"} Dec 05 21:02:29 crc kubenswrapper[4885]: I1205 21:02:29.850398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8faf97ea-3453-4e96-8a29-a7a30aec54c1","Type":"ContainerStarted","Data":"729587f1a419065c0618d4fea009e515771de4738aeec927de533b77c91f30e1"} Dec 05 21:02:46 crc kubenswrapper[4885]: I1205 21:02:46.631066 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:02:46 crc kubenswrapper[4885]: I1205 21:02:46.631757 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:02:46 crc kubenswrapper[4885]: I1205 21:02:46.631820 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 21:02:46 crc kubenswrapper[4885]: I1205 21:02:46.632720 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:02:46 crc kubenswrapper[4885]: I1205 21:02:46.632788 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff" gracePeriod=600 Dec 05 21:02:47 crc kubenswrapper[4885]: I1205 21:02:47.015580 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff" exitCode=0 Dec 05 21:02:47 crc kubenswrapper[4885]: I1205 21:02:47.015842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff"} Dec 05 21:02:47 crc kubenswrapper[4885]: I1205 21:02:47.015866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847"} Dec 05 21:02:47 crc kubenswrapper[4885]: I1205 21:02:47.015883 4885 scope.go:117] "RemoveContainer" containerID="b30a5c345bcdf9a2443e64f3277faece54fd4d04798bfe02e39cdcfea9d1552d" Dec 05 21:02:47 crc kubenswrapper[4885]: I1205 21:02:47.043486 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=19.919686083 podStartE2EDuration="21.043469992s" podCreationTimestamp="2025-12-05 21:02:26 +0000 UTC" firstStartedPulling="2025-12-05 21:02:27.57341983 +0000 UTC m=+3412.870235491" lastFinishedPulling="2025-12-05 21:02:28.697203719 +0000 UTC m=+3413.994019400" observedRunningTime="2025-12-05 21:02:29.864425277 +0000 UTC m=+3415.161240938" watchObservedRunningTime="2025-12-05 21:02:47.043469992 +0000 UTC m=+3432.340285653" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.657644 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-69vbh/must-gather-xzb9c"] Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.665446 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.667907 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-69vbh"/"kube-root-ca.crt" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.668056 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-69vbh"/"openshift-service-ca.crt" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.679654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-69vbh/must-gather-xzb9c"] Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.778541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2z5\" (UniqueName: \"kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.778658 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.880852 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2z5\" (UniqueName: \"kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.881437 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " 
pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.882050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.898601 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2z5\" (UniqueName: \"kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5\") pod \"must-gather-xzb9c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:50 crc kubenswrapper[4885]: I1205 21:02:50.990392 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:02:51 crc kubenswrapper[4885]: I1205 21:02:51.463379 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-69vbh/must-gather-xzb9c"] Dec 05 21:02:51 crc kubenswrapper[4885]: I1205 21:02:51.483758 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:02:52 crc kubenswrapper[4885]: I1205 21:02:52.066693 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/must-gather-xzb9c" event={"ID":"2b92cdd5-01a2-4e80-b003-4e02f77eb87c","Type":"ContainerStarted","Data":"f94a182c8b9f2b6ba54e171d14454ce008ab6158e7935d93538c72fe2b63c5f8"} Dec 05 21:02:56 crc kubenswrapper[4885]: I1205 21:02:56.098667 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/must-gather-xzb9c" event={"ID":"2b92cdd5-01a2-4e80-b003-4e02f77eb87c","Type":"ContainerStarted","Data":"972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5"} Dec 05 21:02:56 crc kubenswrapper[4885]: I1205 21:02:56.099211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/must-gather-xzb9c" event={"ID":"2b92cdd5-01a2-4e80-b003-4e02f77eb87c","Type":"ContainerStarted","Data":"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad"} Dec 05 21:02:58 crc kubenswrapper[4885]: E1205 21:02:58.611394 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.164:37048->38.102.83.164:43191: write tcp 38.102.83.164:37048->38.102.83.164:43191: write: broken pipe Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.305380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-69vbh/must-gather-xzb9c" podStartSLOduration=5.5117607060000005 podStartE2EDuration="9.30536253s" podCreationTimestamp="2025-12-05 21:02:50 +0000 UTC" firstStartedPulling="2025-12-05 21:02:51.483699735 +0000 UTC m=+3436.780515396" lastFinishedPulling="2025-12-05 21:02:55.277301549 +0000 UTC m=+3440.574117220" observedRunningTime="2025-12-05 21:02:56.114526763 +0000 UTC m=+3441.411342424" watchObservedRunningTime="2025-12-05 21:02:59.30536253 +0000 UTC m=+3444.602178181" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.309149 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-69vbh/crc-debug-hb88x"] Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.310231 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.319035 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-69vbh"/"default-dockercfg-hkmhv" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.432102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ppf\" (UniqueName: \"kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.432344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.534399 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.534584 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.534824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ppf\" (UniqueName: \"kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.568749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ppf\" (UniqueName: \"kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf\") pod \"crc-debug-hb88x\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: I1205 21:02:59.635239 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:02:59 crc kubenswrapper[4885]: W1205 21:02:59.676750 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd06c3dcb_3681_4758_a5b9_b22356ee72b0.slice/crio-96914547108a03e2a32a8f6cb91d4124371abc7b17043a71fc36378adaefe01d WatchSource:0}: Error finding container 96914547108a03e2a32a8f6cb91d4124371abc7b17043a71fc36378adaefe01d: Status 404 returned error can't find the container with id 96914547108a03e2a32a8f6cb91d4124371abc7b17043a71fc36378adaefe01d Dec 05 21:03:00 crc kubenswrapper[4885]: I1205 21:03:00.132063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-hb88x" event={"ID":"d06c3dcb-3681-4758-a5b9-b22356ee72b0","Type":"ContainerStarted","Data":"96914547108a03e2a32a8f6cb91d4124371abc7b17043a71fc36378adaefe01d"} Dec 05 21:03:12 crc kubenswrapper[4885]: I1205 21:03:12.257577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-hb88x" event={"ID":"d06c3dcb-3681-4758-a5b9-b22356ee72b0","Type":"ContainerStarted","Data":"a384ac356d493bf928123b47f73475cb4580dd254cd1dc948bda92640649b8cb"} Dec 05 21:03:12 crc kubenswrapper[4885]: I1205 21:03:12.274915 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-69vbh/crc-debug-hb88x" podStartSLOduration=1.224958479 podStartE2EDuration="13.274898554s" podCreationTimestamp="2025-12-05 21:02:59 +0000 UTC" firstStartedPulling="2025-12-05 21:02:59.678848752 +0000 UTC m=+3444.975664413" lastFinishedPulling="2025-12-05 21:03:11.728788837 +0000 UTC m=+3457.025604488" observedRunningTime="2025-12-05 21:03:12.274181331 +0000 UTC m=+3457.570996992" watchObservedRunningTime="2025-12-05 21:03:12.274898554 +0000 UTC m=+3457.571714215" Dec 05 21:03:54 crc kubenswrapper[4885]: I1205 21:03:54.687379 4885 generic.go:334] "Generic (PLEG): container finished" podID="d06c3dcb-3681-4758-a5b9-b22356ee72b0" containerID="a384ac356d493bf928123b47f73475cb4580dd254cd1dc948bda92640649b8cb" exitCode=0 Dec 05 21:03:54 crc kubenswrapper[4885]: I1205 21:03:54.687492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-hb88x" event={"ID":"d06c3dcb-3681-4758-a5b9-b22356ee72b0","Type":"ContainerDied","Data":"a384ac356d493bf928123b47f73475cb4580dd254cd1dc948bda92640649b8cb"} Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.846290 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.884117 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-hb88x"] Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.901860 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-hb88x"] Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.906717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95ppf\" (UniqueName: \"kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf\") pod \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.906764 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host\") pod \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\" (UID: \"d06c3dcb-3681-4758-a5b9-b22356ee72b0\") " Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.906978 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host" (OuterVolumeSpecName: "host") pod "d06c3dcb-3681-4758-a5b9-b22356ee72b0" (UID: "d06c3dcb-3681-4758-a5b9-b22356ee72b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.907220 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06c3dcb-3681-4758-a5b9-b22356ee72b0-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:55 crc kubenswrapper[4885]: I1205 21:03:55.913523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf" (OuterVolumeSpecName: "kube-api-access-95ppf") pod "d06c3dcb-3681-4758-a5b9-b22356ee72b0" (UID: "d06c3dcb-3681-4758-a5b9-b22356ee72b0"). InnerVolumeSpecName "kube-api-access-95ppf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:56 crc kubenswrapper[4885]: I1205 21:03:56.009543 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95ppf\" (UniqueName: \"kubernetes.io/projected/d06c3dcb-3681-4758-a5b9-b22356ee72b0-kube-api-access-95ppf\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:56 crc kubenswrapper[4885]: I1205 21:03:56.715865 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96914547108a03e2a32a8f6cb91d4124371abc7b17043a71fc36378adaefe01d" Dec 05 21:03:56 crc kubenswrapper[4885]: I1205 21:03:56.716065 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-hb88x" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.047605 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-69vbh/crc-debug-nz4tr"] Dec 05 21:03:57 crc kubenswrapper[4885]: E1205 21:03:57.048006 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06c3dcb-3681-4758-a5b9-b22356ee72b0" containerName="container-00" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.048036 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06c3dcb-3681-4758-a5b9-b22356ee72b0" containerName="container-00" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.048237 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06c3dcb-3681-4758-a5b9-b22356ee72b0" containerName="container-00" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.048992 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.052378 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-69vbh"/"default-dockercfg-hkmhv" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.129992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgj6w\" (UniqueName: \"kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.130117 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.184700 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06c3dcb-3681-4758-a5b9-b22356ee72b0" path="/var/lib/kubelet/pods/d06c3dcb-3681-4758-a5b9-b22356ee72b0/volumes" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.232152 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgj6w\" (UniqueName: \"kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.232224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.232284 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.247966 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgj6w\" (UniqueName: 
\"kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w\") pod \"crc-debug-nz4tr\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.382273 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.724291 4885 generic.go:334] "Generic (PLEG): container finished" podID="a0f4f033-33a4-4851-b01a-831d7ad0ae88" containerID="66400ef7bc92e4d70af61b8fdeafd97cfb0f8b9ac610129d627f1caf1cb65555" exitCode=0 Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.724397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" event={"ID":"a0f4f033-33a4-4851-b01a-831d7ad0ae88","Type":"ContainerDied","Data":"66400ef7bc92e4d70af61b8fdeafd97cfb0f8b9ac610129d627f1caf1cb65555"} Dec 05 21:03:57 crc kubenswrapper[4885]: I1205 21:03:57.724670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" event={"ID":"a0f4f033-33a4-4851-b01a-831d7ad0ae88","Type":"ContainerStarted","Data":"cda860a238454edddac95434a836e1cf2236f90f486d8feb5fa57f2bf8cf6001"} Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.237180 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-nz4tr"] Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.245095 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-nz4tr"] Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.844694 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.961384 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgj6w\" (UniqueName: \"kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w\") pod \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.961561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host\") pod \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\" (UID: \"a0f4f033-33a4-4851-b01a-831d7ad0ae88\") " Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.961727 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host" (OuterVolumeSpecName: "host") pod "a0f4f033-33a4-4851-b01a-831d7ad0ae88" (UID: "a0f4f033-33a4-4851-b01a-831d7ad0ae88"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.962592 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0f4f033-33a4-4851-b01a-831d7ad0ae88-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:58 crc kubenswrapper[4885]: I1205 21:03:58.967298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w" (OuterVolumeSpecName: "kube-api-access-wgj6w") pod "a0f4f033-33a4-4851-b01a-831d7ad0ae88" (UID: "a0f4f033-33a4-4851-b01a-831d7ad0ae88"). 
InnerVolumeSpecName "kube-api-access-wgj6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.065114 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgj6w\" (UniqueName: \"kubernetes.io/projected/a0f4f033-33a4-4851-b01a-831d7ad0ae88-kube-api-access-wgj6w\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.185340 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f4f033-33a4-4851-b01a-831d7ad0ae88" path="/var/lib/kubelet/pods/a0f4f033-33a4-4851-b01a-831d7ad0ae88/volumes" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.437294 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-69vbh/crc-debug-ssxqp"] Dec 05 21:03:59 crc kubenswrapper[4885]: E1205 21:03:59.438935 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f4f033-33a4-4851-b01a-831d7ad0ae88" containerName="container-00" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.439042 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f4f033-33a4-4851-b01a-831d7ad0ae88" containerName="container-00" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.439330 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f4f033-33a4-4851-b01a-831d7ad0ae88" containerName="container-00" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.439983 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.576791 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.576892 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jfp\" (UniqueName: \"kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.679680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jfp\" (UniqueName: \"kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.679977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.680282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.708757 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jfp\" (UniqueName: \"kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp\") pod \"crc-debug-ssxqp\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.749494 4885 scope.go:117] "RemoveContainer" containerID="66400ef7bc92e4d70af61b8fdeafd97cfb0f8b9ac610129d627f1caf1cb65555" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.749559 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-nz4tr" Dec 05 21:03:59 crc kubenswrapper[4885]: I1205 21:03:59.766936 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:03:59 crc kubenswrapper[4885]: W1205 21:03:59.838521 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c8d32e_9032_4dbe_ba8f_b0726f7242d2.slice/crio-975359bc4d7a68feeac2c56c0b0553b389937478105ce69ee33f0d183d64c8f4 WatchSource:0}: Error finding container 975359bc4d7a68feeac2c56c0b0553b389937478105ce69ee33f0d183d64c8f4: Status 404 returned error can't find the container with id 975359bc4d7a68feeac2c56c0b0553b389937478105ce69ee33f0d183d64c8f4 Dec 05 21:04:00 crc kubenswrapper[4885]: I1205 21:04:00.766850 4885 generic.go:334] "Generic (PLEG): container finished" podID="97c8d32e-9032-4dbe-ba8f-b0726f7242d2" containerID="d85b6c5928aecc1641c4da11ecc2e8346748b70318e0cf186e7f23eafeb66fd9" exitCode=0 Dec 05 21:04:00 crc kubenswrapper[4885]: I1205 21:04:00.766924 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" event={"ID":"97c8d32e-9032-4dbe-ba8f-b0726f7242d2","Type":"ContainerDied","Data":"d85b6c5928aecc1641c4da11ecc2e8346748b70318e0cf186e7f23eafeb66fd9"} Dec 05 21:04:00 crc kubenswrapper[4885]: I1205 21:04:00.767496 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" event={"ID":"97c8d32e-9032-4dbe-ba8f-b0726f7242d2","Type":"ContainerStarted","Data":"975359bc4d7a68feeac2c56c0b0553b389937478105ce69ee33f0d183d64c8f4"} Dec 05 21:04:00 crc kubenswrapper[4885]: I1205 21:04:00.821640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-ssxqp"] Dec 05 21:04:00 crc kubenswrapper[4885]: I1205 21:04:00.834255 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-69vbh/crc-debug-ssxqp"] Dec 05 21:04:01 crc kubenswrapper[4885]: I1205 21:04:01.882567 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.026545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jfp\" (UniqueName: \"kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp\") pod \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.026631 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host\") pod \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\" (UID: \"97c8d32e-9032-4dbe-ba8f-b0726f7242d2\") " Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.026984 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host" (OuterVolumeSpecName: "host") pod "97c8d32e-9032-4dbe-ba8f-b0726f7242d2" (UID: "97c8d32e-9032-4dbe-ba8f-b0726f7242d2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.027124 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.032179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp" (OuterVolumeSpecName: "kube-api-access-z5jfp") pod "97c8d32e-9032-4dbe-ba8f-b0726f7242d2" (UID: "97c8d32e-9032-4dbe-ba8f-b0726f7242d2"). InnerVolumeSpecName "kube-api-access-z5jfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.128672 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5jfp\" (UniqueName: \"kubernetes.io/projected/97c8d32e-9032-4dbe-ba8f-b0726f7242d2-kube-api-access-z5jfp\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.785542 4885 scope.go:117] "RemoveContainer" containerID="d85b6c5928aecc1641c4da11ecc2e8346748b70318e0cf186e7f23eafeb66fd9" Dec 05 21:04:02 crc kubenswrapper[4885]: I1205 21:04:02.785639 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/crc-debug-ssxqp" Dec 05 21:04:03 crc kubenswrapper[4885]: I1205 21:04:03.183310 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c8d32e-9032-4dbe-ba8f-b0726f7242d2" path="/var/lib/kubelet/pods/97c8d32e-9032-4dbe-ba8f-b0726f7242d2/volumes" Dec 05 21:04:17 crc kubenswrapper[4885]: I1205 21:04:17.895876 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-787956bb96-gzkln_8af09c24-8a48-47cc-ad7c-1778f9a27547/barbican-api/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.129774 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-787956bb96-gzkln_8af09c24-8a48-47cc-ad7c-1778f9a27547/barbican-api-log/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.153836 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f44776d88-2k4qb_84964967-0f37-47c8-919f-3a68040a1d36/barbican-keystone-listener/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.205457 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f44776d88-2k4qb_84964967-0f37-47c8-919f-3a68040a1d36/barbican-keystone-listener-log/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.319282 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-549d6dd897-jd542_f10197ca-8886-4668-b3e8-1179bdb7041d/barbican-worker/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.333487 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-549d6dd897-jd542_f10197ca-8886-4668-b3e8-1179bdb7041d/barbican-worker-log/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.496854 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d_54bae71b-4af1-49b5-a41b-58e6aafd26ca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.520466 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/ceilometer-central-agent/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.648278 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/ceilometer-notification-agent/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.699657 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/proxy-httpd/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.701318 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/sg-core/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.863836 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_232e06c4-ecaf-4959-b1e2-0c183f6afb64/cinder-api/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.894272 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_232e06c4-ecaf-4959-b1e2-0c183f6afb64/cinder-api-log/0.log" Dec 05 21:04:18 crc kubenswrapper[4885]: I1205 21:04:18.993026 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85ff2041-1a3f-46c9-ba86-9440a4c1e129/cinder-scheduler/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 
21:04:19.081690 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85ff2041-1a3f-46c9-ba86-9440a4c1e129/probe/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.140862 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nvh26_cf7e7e25-a243-4caf-8b1a-34c1830a097e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.272840 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-klnxp_9487fa66-920b-41fc-beb6-4dffcb4a898a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.341792 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/init/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.521546 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/init/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.546282 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/dnsmasq-dns/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.613652 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9_a16820a2-be4e-45d6-bcef-91810571b95f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.749090 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c88a6c22-ae9a-4d43-9a63-e6ea351eb012/glance-httpd/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.801185 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c88a6c22-ae9a-4d43-9a63-e6ea351eb012/glance-log/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.921391 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d/glance-httpd/0.log" Dec 05 21:04:19 crc kubenswrapper[4885]: I1205 21:04:19.996352 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d/glance-log/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.117911 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d9999949d-c22ch_d0f84b71-1907-4f71-833d-1e5561a4f0f8/horizon/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.282687 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pqprh_9c9ed39f-ee5e-4c66-8171-488ed01847db/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.431460 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d9999949d-c22ch_d0f84b71-1907-4f71-833d-1e5561a4f0f8/horizon-log/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.504237 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-d6c5z_d0a9ab2d-1012-41ba-b810-c7f7f127330e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.696937 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bdf6f4c4b-9n2vm_a8ffb925-d20c-4c24-a3b2-158d9c347b6b/keystone-api/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.765043 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416141-4x6cq_39c99e0c-b27c-4703-a5c0-a380c33df665/keystone-cron/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.827657 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_34d68d6f-5309-4dd5-b361-811ddff64379/kube-state-metrics/0.log" Dec 05 21:04:20 crc kubenswrapper[4885]: I1205 21:04:20.957971 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt_7b51c87e-b603-43e2-bb06-a8e9a0416a59/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:21 crc kubenswrapper[4885]: I1205 21:04:21.503585 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-94b44cc8f-5tpnj_0437ab7b-cd9d-46e8-9bca-7acdbefda1be/neutron-httpd/0.log" Dec 05 21:04:21 crc kubenswrapper[4885]: I1205 21:04:21.506133 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-94b44cc8f-5tpnj_0437ab7b-cd9d-46e8-9bca-7acdbefda1be/neutron-api/0.log" Dec 05 21:04:21 crc kubenswrapper[4885]: I1205 21:04:21.715566 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d_525d9ebb-07fb-41b7-9059-d609ed9cac0e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.130893 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1e275487-025f-4c31-a7f4-267b05218da9/nova-api-log/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.199401 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b8f6973a-6753-4845-a273-798f031cf4d6/nova-cell0-conductor-conductor/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.342905 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1e275487-025f-4c31-a7f4-267b05218da9/nova-api-api/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.394965 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0d7ef835-7090-43c0-b489-8e1adc41fd47/nova-cell1-conductor-conductor/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.493771 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d5627a8a-d602-4c23-bb2f-e07f9c2a8681/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.727479 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9j89h_453597ee-fc9f-4fc6-beb2-e4c75e1236db/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:22 crc kubenswrapper[4885]: I1205 21:04:22.820684 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_41069d3f-c9d5-4278-8171-cebf5434937e/nova-metadata-log/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.108139 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_0d49d4cd-955c-41c7-8df0-63b364cb3e2d/nova-scheduler-scheduler/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.183618 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/mysql-bootstrap/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.333420 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/mysql-bootstrap/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.354964 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/galera/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.579672 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/mysql-bootstrap/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.724554 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/mysql-bootstrap/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.729128 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/galera/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.961126 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_60b132f9-5036-44cd-8d19-e60a39760da0/openstackclient/0.log" Dec 05 21:04:23 crc kubenswrapper[4885]: I1205 21:04:23.983139 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_41069d3f-c9d5-4278-8171-cebf5434937e/nova-metadata-metadata/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.114705 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-z7wfg_28451893-15ed-4dc1-a6ef-f93fed27316e/openstack-network-exporter/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.254811 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server-init/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.441186 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server-init/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.455497 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovs-vswitchd/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.494709 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.735862 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ptwvl_0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99/ovn-controller/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.776380 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-m548j_de5ebae2-9fe8-4b8a-ab85-60226fa56525/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:24 crc kubenswrapper[4885]: I1205 21:04:24.957603 4885 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-northd-0_2cf8581f-1009-4a26-9642-4e154e83dbc1/openstack-network-exporter/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.015647 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2cf8581f-1009-4a26-9642-4e154e83dbc1/ovn-northd/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.059498 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91d9cbb0-7966-411b-86e4-b80882da454e/openstack-network-exporter/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.166734 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91d9cbb0-7966-411b-86e4-b80882da454e/ovsdbserver-nb/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.281576 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7edaf8ab-283b-46bc-89e2-a3c8f681624b/openstack-network-exporter/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.487884 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7edaf8ab-283b-46bc-89e2-a3c8f681624b/ovsdbserver-sb/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.616467 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d98fd5798-8jhxf_eca7ccc4-d1ff-402c-9fe8-0c61746d41d1/placement-api/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.750119 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d98fd5798-8jhxf_eca7ccc4-d1ff-402c-9fe8-0c61746d41d1/placement-log/0.log" Dec 05 21:04:25 crc kubenswrapper[4885]: I1205 21:04:25.810831 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/setup-container/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.016317 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/setup-container/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.045336 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/rabbitmq/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.089361 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/setup-container/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.330538 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/setup-container/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.410001 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/rabbitmq/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.471170 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk_b27a1f4c-ba65-4b22-885a-e642064f7c27/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.641158 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdzlz_a40c582a-e811-4e60-a7fe-1bf467d32e96/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.722130 
4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92_489dbc8e-e2ca-41aa-9e48-ca81bea02758/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.827455 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ptcp8_59678b29-6ffe-4d18-a8bb-8bf4717f9b10/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:26 crc kubenswrapper[4885]: I1205 21:04:26.928533 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jdng_8f98cb4a-349f-443b-aab3-686a3d0bcc67/ssh-known-hosts-edpm-deployment/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.116689 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-56b6f678f7-nt7kq_5df6ff8a-e66c-402d-a7cd-63125b9c6cae/proxy-server/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.183054 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-56b6f678f7-nt7kq_5df6ff8a-e66c-402d-a7cd-63125b9c6cae/proxy-httpd/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.279514 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2j6cb_c5c452f6-0d03-4e67-bab0-0dcb1926f523/swift-ring-rebalance/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.397273 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-auditor/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.405718 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-reaper/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.516173 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-replicator/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.634364 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-auditor/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.635327 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-server/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.681602 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-replicator/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.734270 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-server/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.826909 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-updater/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.926180 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-expirer/0.log" Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.947695 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-auditor/0.log" 
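(Context for the surrounding records: the openshift-must-gather-69vbh/must-gather-xzb9c pod and the short-lived crc-debug-* pods above appear to be the collector side of an `oc adm must-gather` run against this CRC node, and the long run of "Finished parsing log file" records is consistent with the kubelet reading the per-container logs under /var/log/pods to serve those collectors. A minimal sketch of reproducing this kind of collection, assuming only that the node is named crc as in these records and that oc is logged in with cluster-admin:

  oc adm must-gather --dest-dir=./must-gather
  oc debug node/crc -- chroot /host journalctl -u kubelet --no-pager

The namespace suffix (-69vbh) and pod suffixes (-xzb9c, -hb88x, -nz4tr, -ssxqp) are generated per run, so they will differ on a rerun.)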
Dec 05 21:04:27 crc kubenswrapper[4885]: I1205 21:04:27.986630 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-replicator/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.071127 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-server/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.134898 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-updater/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.142746 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/rsync/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.202698 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/swift-recon-cron/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.363572 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5m26m_d6e72054-a861-40ce-b2c9-6212896baaf4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.435934 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d/tempest-tests-tempest-tests-runner/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.638837 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8faf97ea-3453-4e96-8a29-a7a30aec54c1/test-operator-logs-container/0.log" Dec 05 21:04:28 crc kubenswrapper[4885]: I1205 21:04:28.682389 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q_f6fcaa99-97aa-46d8-be19-5cac454e2f77/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:04:34 crc kubenswrapper[4885]: I1205 21:04:34.209318 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_12c40607-2770-4b97-95f1-6ac26280d337/memcached/0.log" Dec 05 21:04:46 crc kubenswrapper[4885]: I1205 21:04:46.631634 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:04:46 crc kubenswrapper[4885]: I1205 21:04:46.632145 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.384882 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.560973 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.625391 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.638390 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.807732 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.814769 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/extract/0.log" Dec 05 21:04:53 crc kubenswrapper[4885]: I1205 21:04:53.848864 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.032108 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-cqj46_74869c39-a4c4-4812-8656-4751d25ef987/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.052621 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-cqj46_74869c39-a4c4-4812-8656-4751d25ef987/manager/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.057289 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-s4ftd_93741f1b-6823-4374-927f-38d95ba139f5/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.214457 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-s4ftd_93741f1b-6823-4374-927f-38d95ba139f5/manager/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.266356 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-nqshj_6a0f526a-c496-478e-bc4c-e6478ebeb3ea/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.288171 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-nqshj_6a0f526a-c496-478e-bc4c-e6478ebeb3ea/manager/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.601841 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kgdg2_9034e951-dbbb-4927-b9fa-fa2e83c1595c/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.669359 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-rqh2l_c942221f-6ad2-4109-9975-ec8054686283/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 
21:04:54.743614 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-rqh2l_c942221f-6ad2-4109-9975-ec8054686283/manager/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.744314 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kgdg2_9034e951-dbbb-4927-b9fa-fa2e83c1595c/manager/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.888451 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-zz7df_ee66e99c-4761-43a5-a55c-b28957859913/kube-rbac-proxy/0.log" Dec 05 21:04:54 crc kubenswrapper[4885]: I1205 21:04:54.952882 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-zz7df_ee66e99c-4761-43a5-a55c-b28957859913/manager/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.095638 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dpqcg_f9775930-6d69-4ad4-a249-f5d2f270b365/kube-rbac-proxy/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.218189 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-z27c2_741c1713-f931-471e-ad95-99d16600ab76/kube-rbac-proxy/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.275868 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dpqcg_f9775930-6d69-4ad4-a249-f5d2f270b365/manager/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.345051 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-z27c2_741c1713-f931-471e-ad95-99d16600ab76/manager/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.413318 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-r6ljq_da47cf7f-37ab-4d5d-99b1-1b312002f83e/kube-rbac-proxy/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.520634 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-r6ljq_da47cf7f-37ab-4d5d-99b1-1b312002f83e/manager/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.813579 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4vb99_ca2be922-afb3-4640-bdad-cfd3b0164d52/kube-rbac-proxy/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.878990 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4vb99_ca2be922-afb3-4640-bdad-cfd3b0164d52/manager/0.log" Dec 05 21:04:55 crc kubenswrapper[4885]: I1205 21:04:55.920720 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hkw2j_e12a10c6-f52c-4348-bb54-356af7632dd4/kube-rbac-proxy/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.008798 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hkw2j_e12a10c6-f52c-4348-bb54-356af7632dd4/manager/0.log" Dec 05 21:04:56 crc 
kubenswrapper[4885]: I1205 21:04:56.092304 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_33f07e6f-9ac8-461d-b455-ad634c2e255c/kube-rbac-proxy/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.171194 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_33f07e6f-9ac8-461d-b455-ad634c2e255c/manager/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.244195 4885 scope.go:117] "RemoveContainer" containerID="156b470e15308146a5fdef022bb45a21f778d4c80647bf2359b958770be83580" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.272718 4885 scope.go:117] "RemoveContainer" containerID="eee84cfc97bfa3c97d02087ae6b5068035cd3498d97b9c18708b4faf0133f871" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.311401 4885 scope.go:117] "RemoveContainer" containerID="12bfc5f92007d88d06ec1c917c72d181adbbb3308db9c386a907bf5031ecd153" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.365552 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5c5m_3e2eaf31-e16e-4072-ae6b-a5c9eda46732/kube-rbac-proxy/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.431090 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5c5m_3e2eaf31-e16e-4072-ae6b-a5c9eda46732/manager/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.453473 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gwtxz_aed37ead-6406-43f0-a6f5-4e8864935a58/kube-rbac-proxy/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.621121 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gwtxz_aed37ead-6406-43f0-a6f5-4e8864935a58/manager/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.638528 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5qfqlm_fdb3c987-9d79-4920-9b95-1be3a3dbc622/kube-rbac-proxy/0.log" Dec 05 21:04:56 crc kubenswrapper[4885]: I1205 21:04:56.641308 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5qfqlm_fdb3c987-9d79-4920-9b95-1be3a3dbc622/manager/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.126769 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jm7lc_0f1ef804-3daa-44e0-a978-f6edc8efab00/registry-server/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.298999 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-qk2s7_15ce450d-0098-4b25-afd2-5bda05cfb5b0/operator/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.315555 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t4mch_06e1a4eb-c6cb-4146-b2f9-484c2e699a7e/kube-rbac-proxy/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.342637 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t4mch_06e1a4eb-c6cb-4146-b2f9-484c2e699a7e/manager/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.496823 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4q2vd_2eea8037-d11c-47ee-9bc9-67deafc20268/kube-rbac-proxy/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.595685 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qpp7t_18cedf03-5e88-4513-b2cc-e364e749f219/operator/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.629410 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4q2vd_2eea8037-d11c-47ee-9bc9-67deafc20268/manager/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.778792 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-t4xtt_c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5/kube-rbac-proxy/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.863238 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-t4xtt_c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5/manager/0.log" Dec 05 21:04:57 crc kubenswrapper[4885]: I1205 21:04:57.898813 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rqs2p_f68526b5-c6b6-484e-b476-1e4c76ba71fd/kube-rbac-proxy/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.090186 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-565xh_49b39782-af0e-4f86-89f4-96582b6a8336/kube-rbac-proxy/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.105915 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-565xh_49b39782-af0e-4f86-89f4-96582b6a8336/manager/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.122748 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rqs2p_f68526b5-c6b6-484e-b476-1e4c76ba71fd/manager/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.192804 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-b47j2_acaad339-be87-48ab-aee8-7f4637190768/manager/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.269376 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nrtkv_f9ccfa3f-a548-4e32-9318-b3f2cb19ccca/kube-rbac-proxy/0.log" Dec 05 21:04:58 crc kubenswrapper[4885]: I1205 21:04:58.321783 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nrtkv_f9ccfa3f-a548-4e32-9318-b3f2cb19ccca/manager/0.log" Dec 05 21:05:16 crc kubenswrapper[4885]: I1205 21:05:16.630633 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Dec 05 21:05:16 crc kubenswrapper[4885]: I1205 21:05:16.631230 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:05:17 crc kubenswrapper[4885]: I1205 21:05:17.864440 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hfsls_1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c/control-plane-machine-set-operator/0.log" Dec 05 21:05:17 crc kubenswrapper[4885]: I1205 21:05:17.947176 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vs7jr_24653880-7b0f-4174-ac74-5d13d99975e9/kube-rbac-proxy/0.log" Dec 05 21:05:18 crc kubenswrapper[4885]: I1205 21:05:18.030143 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vs7jr_24653880-7b0f-4174-ac74-5d13d99975e9/machine-api-operator/0.log" Dec 05 21:05:29 crc kubenswrapper[4885]: I1205 21:05:29.723236 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-z8hk7_c7c60e10-72a8-4031-8e22-2f7b2ccc720c/cert-manager-controller/0.log" Dec 05 21:05:30 crc kubenswrapper[4885]: I1205 21:05:29.981798 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4swqf_21e6c715-7d1f-405a-9d66-8ac102a2e623/cert-manager-webhook/0.log" Dec 05 21:05:30 crc kubenswrapper[4885]: I1205 21:05:29.985818 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6th47_c3ccb845-eaa6-44fd-b7ea-4f3739516528/cert-manager-cainjector/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.073813 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-vgfwc_72454e30-d40f-408d-93f6-c0cf1ce2f400/nmstate-console-plugin/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.164259 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ndhsp_912fc0d4-121a-4073-9e85-a2277a5078d8/nmstate-handler/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.246879 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lhpld_7e345d16-e7f9-4881-a031-eb5ef37e22b3/kube-rbac-proxy/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.259441 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lhpld_7e345d16-e7f9-4881-a031-eb5ef37e22b3/nmstate-metrics/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.404871 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-p8qxg_5275a59b-4935-4ce8-8552-ed28f0377be5/nmstate-operator/0.log" Dec 05 21:05:41 crc kubenswrapper[4885]: I1205 21:05:41.452986 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-ph4g7_7001b6ac-1126-4d81-9148-47e6f7f830c1/nmstate-webhook/0.log" Dec 05 21:05:46 crc kubenswrapper[4885]: I1205 21:05:46.631454 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:05:46 crc kubenswrapper[4885]: I1205 21:05:46.632134 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:05:46 crc kubenswrapper[4885]: I1205 21:05:46.632216 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 21:05:46 crc kubenswrapper[4885]: I1205 21:05:46.633287 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:05:46 crc kubenswrapper[4885]: I1205 21:05:46.633377 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" gracePeriod=600 Dec 05 21:05:46 crc kubenswrapper[4885]: E1205 21:05:46.755053 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:05:47 crc kubenswrapper[4885]: I1205 21:05:47.672892 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" exitCode=0 Dec 05 21:05:47 crc kubenswrapper[4885]: I1205 21:05:47.673041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847"} Dec 05 21:05:47 crc kubenswrapper[4885]: I1205 21:05:47.673242 4885 scope.go:117] "RemoveContainer" containerID="55f15dab397240767972c0a0157905fc91315a34d8e0fb0cfbfb80eaa3e064ff" Dec 05 21:05:47 crc kubenswrapper[4885]: I1205 21:05:47.673845 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:05:47 crc kubenswrapper[4885]: E1205 21:05:47.674175 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:05:54 crc kubenswrapper[4885]: I1205 
21:05:54.664397 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gwwj5_61bdac93-6e5a-4b95-a146-ea0874dc5962/kube-rbac-proxy/0.log" Dec 05 21:05:54 crc kubenswrapper[4885]: I1205 21:05:54.741381 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gwwj5_61bdac93-6e5a-4b95-a146-ea0874dc5962/controller/0.log" Dec 05 21:05:54 crc kubenswrapper[4885]: I1205 21:05:54.873442 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.078858 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.103122 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.103323 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.143215 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.321938 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.322654 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.338752 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.348833 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.539910 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.569461 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.572168 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.576587 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/controller/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.757955 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/frr-metrics/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.772431 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/kube-rbac-proxy-frr/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.775964 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/kube-rbac-proxy/0.log" Dec 05 21:05:55 crc kubenswrapper[4885]: I1205 21:05:55.994163 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p9slq_a1b920f0-0596-43ef-b94b-d3035f0e5e1c/frr-k8s-webhook-server/0.log" Dec 05 21:05:56 crc kubenswrapper[4885]: I1205 21:05:56.012730 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/reloader/0.log" Dec 05 21:05:56 crc kubenswrapper[4885]: I1205 21:05:56.294667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fb9f8748-k8dk7_dd4c62d1-80af-4d61-bc04-6ac5c8259121/manager/0.log" Dec 05 21:05:56 crc kubenswrapper[4885]: I1205 21:05:56.582994 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5dcf889d57-wtshh_8263fedc-0c2a-4de8-8d5c-47aa32b745ee/webhook-server/0.log" Dec 05 21:05:56 crc kubenswrapper[4885]: I1205 21:05:56.637397 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5jq2d_2fa36864-508b-488b-8830-d60337213cca/kube-rbac-proxy/0.log" Dec 05 21:05:57 crc kubenswrapper[4885]: I1205 21:05:57.017790 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/frr/0.log" Dec 05 21:05:57 crc kubenswrapper[4885]: I1205 21:05:57.149875 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5jq2d_2fa36864-508b-488b-8830-d60337213cca/speaker/0.log" Dec 05 21:06:02 crc kubenswrapper[4885]: I1205 21:06:02.172921 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:06:02 crc kubenswrapper[4885]: E1205 21:06:02.174104 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.017003 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.153831 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.188101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.209449 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.355293 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.389470 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.397556 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/extract/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.512367 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.688889 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.729672 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.737237 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.900549 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.907638 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/extract/0.log" Dec 05 21:06:09 crc kubenswrapper[4885]: I1205 21:06:09.914200 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.069728 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.206568 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.223199 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.251875 
4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.371689 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.410095 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.588091 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.752809 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.758633 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.783254 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.921353 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:06:10 crc kubenswrapper[4885]: I1205 21:06:10.938494 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.145637 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/registry-server/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.171342 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-djpjw_b7708f77-d399-4d7e-8034-9e043e56aabe/marketplace-operator/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.330262 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.601244 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.623124 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/registry-server/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.668458 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.675847 4885 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.864188 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.864877 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:06:11 crc kubenswrapper[4885]: I1205 21:06:11.985751 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/registry-server/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.038829 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.234343 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.236435 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.257612 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.394623 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.395898 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:06:12 crc kubenswrapper[4885]: I1205 21:06:12.962119 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/registry-server/0.log" Dec 05 21:06:15 crc kubenswrapper[4885]: I1205 21:06:15.180222 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:06:15 crc kubenswrapper[4885]: E1205 21:06:15.181002 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:06:27 crc kubenswrapper[4885]: I1205 21:06:27.173846 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:06:27 crc kubenswrapper[4885]: E1205 21:06:27.178208 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:06:42 crc kubenswrapper[4885]: I1205 21:06:42.173345 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:06:42 crc kubenswrapper[4885]: E1205 21:06:42.174209 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:06:57 crc kubenswrapper[4885]: I1205 21:06:57.182638 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:06:57 crc kubenswrapper[4885]: E1205 21:06:57.206905 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:07:10 crc kubenswrapper[4885]: I1205 21:07:10.172547 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:07:10 crc kubenswrapper[4885]: E1205 21:07:10.173494 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:07:24 crc kubenswrapper[4885]: I1205 21:07:24.173245 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:07:24 crc kubenswrapper[4885]: E1205 21:07:24.174215 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:07:38 crc kubenswrapper[4885]: I1205 21:07:38.173342 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:07:38 crc kubenswrapper[4885]: E1205 21:07:38.174412 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:07:50 crc kubenswrapper[4885]: I1205 21:07:50.814997 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerID="26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad" exitCode=0 Dec 05 21:07:50 crc kubenswrapper[4885]: I1205 21:07:50.815247 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69vbh/must-gather-xzb9c" event={"ID":"2b92cdd5-01a2-4e80-b003-4e02f77eb87c","Type":"ContainerDied","Data":"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad"} Dec 05 21:07:50 crc kubenswrapper[4885]: I1205 21:07:50.816863 4885 scope.go:117] "RemoveContainer" containerID="26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad" Dec 05 21:07:50 crc kubenswrapper[4885]: I1205 21:07:50.912102 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69vbh_must-gather-xzb9c_2b92cdd5-01a2-4e80-b003-4e02f77eb87c/gather/0.log" Dec 05 21:07:51 crc kubenswrapper[4885]: I1205 21:07:51.172664 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:07:51 crc kubenswrapper[4885]: E1205 21:07:51.172940 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:07:58 crc kubenswrapper[4885]: I1205 21:07:58.803239 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-69vbh/must-gather-xzb9c"] Dec 05 21:07:58 crc kubenswrapper[4885]: I1205 21:07:58.803866 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-69vbh/must-gather-xzb9c" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="copy" containerID="cri-o://972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5" gracePeriod=2 Dec 05 21:07:58 crc kubenswrapper[4885]: I1205 21:07:58.815175 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-69vbh/must-gather-xzb9c"] Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.294991 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69vbh_must-gather-xzb9c_2b92cdd5-01a2-4e80-b003-4e02f77eb87c/copy/0.log" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.295722 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.377722 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output\") pod \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.377863 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw2z5\" (UniqueName: \"kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5\") pod \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\" (UID: \"2b92cdd5-01a2-4e80-b003-4e02f77eb87c\") " Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.384718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5" (OuterVolumeSpecName: "kube-api-access-bw2z5") pod "2b92cdd5-01a2-4e80-b003-4e02f77eb87c" (UID: "2b92cdd5-01a2-4e80-b003-4e02f77eb87c"). InnerVolumeSpecName "kube-api-access-bw2z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.481124 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw2z5\" (UniqueName: \"kubernetes.io/projected/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-kube-api-access-bw2z5\") on node \"crc\" DevicePath \"\"" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.579253 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2b92cdd5-01a2-4e80-b003-4e02f77eb87c" (UID: "2b92cdd5-01a2-4e80-b003-4e02f77eb87c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.584143 4885 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b92cdd5-01a2-4e80-b003-4e02f77eb87c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.915194 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69vbh_must-gather-xzb9c_2b92cdd5-01a2-4e80-b003-4e02f77eb87c/copy/0.log" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.915727 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerID="972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5" exitCode=143 Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.915771 4885 scope.go:117] "RemoveContainer" containerID="972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.915813 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69vbh/must-gather-xzb9c" Dec 05 21:07:59 crc kubenswrapper[4885]: I1205 21:07:59.938405 4885 scope.go:117] "RemoveContainer" containerID="26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad" Dec 05 21:08:00 crc kubenswrapper[4885]: I1205 21:08:00.017358 4885 scope.go:117] "RemoveContainer" containerID="972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5" Dec 05 21:08:00 crc kubenswrapper[4885]: E1205 21:08:00.017842 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5\": container with ID starting with 972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5 not found: ID does not exist" containerID="972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5" Dec 05 21:08:00 crc kubenswrapper[4885]: I1205 21:08:00.017873 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5"} err="failed to get container status \"972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5\": rpc error: code = NotFound desc = could not find container \"972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5\": container with ID starting with 972e2caca43f9be08dca8b516087a460465a1d288b2b471258f5f4c344be60f5 not found: ID does not exist" Dec 05 21:08:00 crc kubenswrapper[4885]: I1205 21:08:00.017895 4885 scope.go:117] "RemoveContainer" containerID="26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad" Dec 05 21:08:00 crc kubenswrapper[4885]: E1205 21:08:00.018417 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad\": container with ID starting with 26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad not found: ID does not exist" containerID="26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad" Dec 05 21:08:00 crc kubenswrapper[4885]: I1205 21:08:00.018484 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad"} err="failed to get container status \"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad\": rpc error: code = NotFound desc = could not find container \"26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad\": container with ID starting with 26335c4ea7e2fb48690c9de4f67de89ab2508e9979857112345e3b49e3952fad not found: ID does not exist" Dec 05 21:08:01 crc kubenswrapper[4885]: I1205 21:08:01.190925 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" path="/var/lib/kubelet/pods/2b92cdd5-01a2-4e80-b003-4e02f77eb87c/volumes" Dec 05 21:08:03 crc kubenswrapper[4885]: I1205 21:08:03.173300 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:08:03 crc kubenswrapper[4885]: E1205 21:08:03.173960 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:08:16 crc kubenswrapper[4885]: I1205 21:08:16.174007 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:08:16 crc kubenswrapper[4885]: E1205 21:08:16.174958 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:08:30 crc kubenswrapper[4885]: I1205 21:08:30.172820 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:08:30 crc kubenswrapper[4885]: E1205 21:08:30.173683 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.172512 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:08:43 crc kubenswrapper[4885]: E1205 21:08:43.175988 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.431420 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:08:43 crc kubenswrapper[4885]: E1205 21:08:43.431802 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="copy" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.431817 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="copy" Dec 05 21:08:43 crc kubenswrapper[4885]: E1205 21:08:43.431848 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="gather" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.431854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="gather" Dec 05 21:08:43 crc kubenswrapper[4885]: E1205 21:08:43.431874 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c8d32e-9032-4dbe-ba8f-b0726f7242d2" containerName="container-00" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.431886 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c8d32e-9032-4dbe-ba8f-b0726f7242d2" containerName="container-00" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.432076 4885 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="gather" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.432101 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b92cdd5-01a2-4e80-b003-4e02f77eb87c" containerName="copy" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.432111 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c8d32e-9032-4dbe-ba8f-b0726f7242d2" containerName="container-00" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.433425 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.440105 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.592277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.592318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.592436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvsqn\" (UniqueName: \"kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.694072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvsqn\" (UniqueName: \"kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.694187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.694212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.694836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content\") pod 
\"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.694842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.713431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvsqn\" (UniqueName: \"kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn\") pod \"community-operators-9fz7m\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:43 crc kubenswrapper[4885]: I1205 21:08:43.811779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:44 crc kubenswrapper[4885]: I1205 21:08:44.342792 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:08:44 crc kubenswrapper[4885]: I1205 21:08:44.381660 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerStarted","Data":"8fa5bfe5d9b7ea42d9a3cabe1a2fb90f812fd80fedc15d8abbb809b2545a29e0"} Dec 05 21:08:45 crc kubenswrapper[4885]: I1205 21:08:45.397546 4885 generic.go:334] "Generic (PLEG): container finished" podID="e900e882-c31d-4c69-a675-852c562f3c3a" containerID="462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f" exitCode=0 Dec 05 21:08:45 crc kubenswrapper[4885]: I1205 21:08:45.397628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerDied","Data":"462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f"} Dec 05 21:08:45 crc kubenswrapper[4885]: I1205 21:08:45.401001 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.184949 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.188144 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.199742 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.295052 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.295099 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.295142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8zx\" (UniqueName: \"kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.396715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.397238 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.397278 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8zx\" (UniqueName: \"kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.397186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.397855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.417741 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qp8zx\" (UniqueName: \"kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx\") pod \"certified-operators-tcwnz\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.425121 4885 generic.go:334] "Generic (PLEG): container finished" podID="e900e882-c31d-4c69-a675-852c562f3c3a" containerID="cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58" exitCode=0 Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.425307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerDied","Data":"cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58"} Dec 05 21:08:48 crc kubenswrapper[4885]: I1205 21:08:48.506165 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:49 crc kubenswrapper[4885]: I1205 21:08:49.042924 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:08:49 crc kubenswrapper[4885]: W1205 21:08:49.049249 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9e654d_cb0c_4f31_b798_ddb0da1753a0.slice/crio-89d4b1e8dea13aad74df74a15e6e47cbfa81b9d2284a7d42cd0549fd69a9c774 WatchSource:0}: Error finding container 89d4b1e8dea13aad74df74a15e6e47cbfa81b9d2284a7d42cd0549fd69a9c774: Status 404 returned error can't find the container with id 89d4b1e8dea13aad74df74a15e6e47cbfa81b9d2284a7d42cd0549fd69a9c774 Dec 05 21:08:49 crc kubenswrapper[4885]: I1205 21:08:49.439568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerStarted","Data":"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63"} Dec 05 21:08:49 crc kubenswrapper[4885]: I1205 21:08:49.439882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerStarted","Data":"89d4b1e8dea13aad74df74a15e6e47cbfa81b9d2284a7d42cd0549fd69a9c774"} Dec 05 21:08:50 crc kubenswrapper[4885]: I1205 21:08:50.455964 4885 generic.go:334] "Generic (PLEG): container finished" podID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerID="e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63" exitCode=0 Dec 05 21:08:50 crc kubenswrapper[4885]: I1205 21:08:50.456064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerDied","Data":"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63"} Dec 05 21:08:50 crc kubenswrapper[4885]: I1205 21:08:50.462226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerStarted","Data":"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4"} Dec 05 21:08:50 crc kubenswrapper[4885]: I1205 21:08:50.500330 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fz7m" 
podStartSLOduration=3.103092805 podStartE2EDuration="7.500304601s" podCreationTimestamp="2025-12-05 21:08:43 +0000 UTC" firstStartedPulling="2025-12-05 21:08:45.400753826 +0000 UTC m=+3790.697569487" lastFinishedPulling="2025-12-05 21:08:49.797965622 +0000 UTC m=+3795.094781283" observedRunningTime="2025-12-05 21:08:50.493121507 +0000 UTC m=+3795.789937188" watchObservedRunningTime="2025-12-05 21:08:50.500304601 +0000 UTC m=+3795.797120272" Dec 05 21:08:52 crc kubenswrapper[4885]: I1205 21:08:52.482440 4885 generic.go:334] "Generic (PLEG): container finished" podID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerID="11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80" exitCode=0 Dec 05 21:08:52 crc kubenswrapper[4885]: I1205 21:08:52.482616 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerDied","Data":"11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80"} Dec 05 21:08:53 crc kubenswrapper[4885]: I1205 21:08:53.812600 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:53 crc kubenswrapper[4885]: I1205 21:08:53.813048 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:53 crc kubenswrapper[4885]: I1205 21:08:53.881658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:08:54 crc kubenswrapper[4885]: I1205 21:08:54.173664 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:08:54 crc kubenswrapper[4885]: E1205 21:08:54.173974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:08:54 crc kubenswrapper[4885]: I1205 21:08:54.501965 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerStarted","Data":"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887"} Dec 05 21:08:54 crc kubenswrapper[4885]: I1205 21:08:54.519472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tcwnz" podStartSLOduration=3.330202117 podStartE2EDuration="6.519451233s" podCreationTimestamp="2025-12-05 21:08:48 +0000 UTC" firstStartedPulling="2025-12-05 21:08:50.457477763 +0000 UTC m=+3795.754293424" lastFinishedPulling="2025-12-05 21:08:53.646726879 +0000 UTC m=+3798.943542540" observedRunningTime="2025-12-05 21:08:54.516921394 +0000 UTC m=+3799.813737055" watchObservedRunningTime="2025-12-05 21:08:54.519451233 +0000 UTC m=+3799.816266894" Dec 05 21:08:58 crc kubenswrapper[4885]: I1205 21:08:58.506700 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:58 crc kubenswrapper[4885]: I1205 21:08:58.508072 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:58 crc kubenswrapper[4885]: I1205 21:08:58.555479 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:59 crc kubenswrapper[4885]: I1205 21:08:59.591372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:08:59 crc kubenswrapper[4885]: I1205 21:08:59.640769 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:09:01 crc kubenswrapper[4885]: I1205 21:09:01.559093 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tcwnz" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="registry-server" containerID="cri-o://27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887" gracePeriod=2 Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.196787 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.272924 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content\") pod \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.273072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8zx\" (UniqueName: \"kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx\") pod \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.273120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities\") pod \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\" (UID: \"7b9e654d-cb0c-4f31-b798-ddb0da1753a0\") " Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.274188 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities" (OuterVolumeSpecName: "utilities") pod "7b9e654d-cb0c-4f31-b798-ddb0da1753a0" (UID: "7b9e654d-cb0c-4f31-b798-ddb0da1753a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.280268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx" (OuterVolumeSpecName: "kube-api-access-qp8zx") pod "7b9e654d-cb0c-4f31-b798-ddb0da1753a0" (UID: "7b9e654d-cb0c-4f31-b798-ddb0da1753a0"). InnerVolumeSpecName "kube-api-access-qp8zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.324232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b9e654d-cb0c-4f31-b798-ddb0da1753a0" (UID: "7b9e654d-cb0c-4f31-b798-ddb0da1753a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.374856 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.374910 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp8zx\" (UniqueName: \"kubernetes.io/projected/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-kube-api-access-qp8zx\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.374922 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9e654d-cb0c-4f31-b798-ddb0da1753a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.571216 4885 generic.go:334] "Generic (PLEG): container finished" podID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerID="27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887" exitCode=0 Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.571389 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tcwnz" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.571267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerDied","Data":"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887"} Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.572007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcwnz" event={"ID":"7b9e654d-cb0c-4f31-b798-ddb0da1753a0","Type":"ContainerDied","Data":"89d4b1e8dea13aad74df74a15e6e47cbfa81b9d2284a7d42cd0549fd69a9c774"} Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.572128 4885 scope.go:117] "RemoveContainer" containerID="27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.613766 4885 scope.go:117] "RemoveContainer" containerID="11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.618827 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.635336 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tcwnz"] Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.646965 4885 scope.go:117] "RemoveContainer" containerID="e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.698432 4885 scope.go:117] "RemoveContainer" containerID="27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887" Dec 05 21:09:02 crc kubenswrapper[4885]: E1205 21:09:02.698964 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887\": container with ID starting with 27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887 not found: ID does not exist" containerID="27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.699205 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887"} err="failed to get container status \"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887\": rpc error: code = NotFound desc = could not find container \"27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887\": container with ID starting with 27a8548e8721d876f2d7085e7ed3d6a14d6b1f33f034242e75f8a1a164e45887 not found: ID does not exist" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.699244 4885 scope.go:117] "RemoveContainer" containerID="11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80" Dec 05 21:09:02 crc kubenswrapper[4885]: E1205 21:09:02.699966 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80\": container with ID starting with 11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80 not found: ID does not exist" containerID="11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.699995 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80"} err="failed to get container status \"11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80\": rpc error: code = NotFound desc = could not find container \"11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80\": container with ID starting with 11a1e427d2c1903a1a938ecb51ca0a14e15188d6f64b9d31ee6327aabcc44a80 not found: ID does not exist" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.700009 4885 scope.go:117] "RemoveContainer" containerID="e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63" Dec 05 21:09:02 crc kubenswrapper[4885]: E1205 21:09:02.700301 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63\": container with ID starting with e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63 not found: ID does not exist" containerID="e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63" Dec 05 21:09:02 crc kubenswrapper[4885]: I1205 21:09:02.700331 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63"} err="failed to get container status \"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63\": rpc error: code = NotFound desc = could not find container \"e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63\": container with ID starting with e9b267300c4132af03f21f1c1872d13e0563ad3adb8ab59e6cbac91b59e96d63 not found: ID does not exist" Dec 05 21:09:03 crc kubenswrapper[4885]: I1205 21:09:03.185348 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" path="/var/lib/kubelet/pods/7b9e654d-cb0c-4f31-b798-ddb0da1753a0/volumes" Dec 05 21:09:03 crc kubenswrapper[4885]: I1205 21:09:03.867301 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:09:04 crc kubenswrapper[4885]: I1205 21:09:04.431599 4885 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:09:04 crc kubenswrapper[4885]: I1205 21:09:04.652843 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fz7m" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="registry-server" containerID="cri-o://b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4" gracePeriod=2 Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.182131 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:09:05 crc kubenswrapper[4885]: E1205 21:09:05.182948 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.232577 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.330549 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities\") pod \"e900e882-c31d-4c69-a675-852c562f3c3a\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.330679 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content\") pod \"e900e882-c31d-4c69-a675-852c562f3c3a\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.330810 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvsqn\" (UniqueName: \"kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn\") pod \"e900e882-c31d-4c69-a675-852c562f3c3a\" (UID: \"e900e882-c31d-4c69-a675-852c562f3c3a\") " Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.331536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities" (OuterVolumeSpecName: "utilities") pod "e900e882-c31d-4c69-a675-852c562f3c3a" (UID: "e900e882-c31d-4c69-a675-852c562f3c3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.336622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn" (OuterVolumeSpecName: "kube-api-access-kvsqn") pod "e900e882-c31d-4c69-a675-852c562f3c3a" (UID: "e900e882-c31d-4c69-a675-852c562f3c3a"). InnerVolumeSpecName "kube-api-access-kvsqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.388866 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e900e882-c31d-4c69-a675-852c562f3c3a" (UID: "e900e882-c31d-4c69-a675-852c562f3c3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.433871 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.433913 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e900e882-c31d-4c69-a675-852c562f3c3a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.433924 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvsqn\" (UniqueName: \"kubernetes.io/projected/e900e882-c31d-4c69-a675-852c562f3c3a-kube-api-access-kvsqn\") on node \"crc\" DevicePath \"\"" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.669758 4885 generic.go:334] "Generic (PLEG): container finished" podID="e900e882-c31d-4c69-a675-852c562f3c3a" containerID="b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4" exitCode=0 Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.669802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerDied","Data":"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4"} Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.669843 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz7m" event={"ID":"e900e882-c31d-4c69-a675-852c562f3c3a","Type":"ContainerDied","Data":"8fa5bfe5d9b7ea42d9a3cabe1a2fb90f812fd80fedc15d8abbb809b2545a29e0"} Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.669865 4885 scope.go:117] "RemoveContainer" containerID="b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.669862 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fz7m" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.709958 4885 scope.go:117] "RemoveContainer" containerID="cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.718164 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.725677 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9fz7m"] Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.729913 4885 scope.go:117] "RemoveContainer" containerID="462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.775790 4885 scope.go:117] "RemoveContainer" containerID="b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4" Dec 05 21:09:05 crc kubenswrapper[4885]: E1205 21:09:05.776494 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4\": container with ID starting with b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4 not found: ID does not exist" containerID="b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.776525 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4"} err="failed to get container status \"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4\": rpc error: code = NotFound desc = could not find container \"b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4\": container with ID starting with b1afe36e56d50d2eda0fb781814cb71d775cbea91990e403852e016bcf4b1db4 not found: ID does not exist" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.776547 4885 scope.go:117] "RemoveContainer" containerID="cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58" Dec 05 21:09:05 crc kubenswrapper[4885]: E1205 21:09:05.776889 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58\": container with ID starting with cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58 not found: ID does not exist" containerID="cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.777006 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58"} err="failed to get container status \"cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58\": rpc error: code = NotFound desc = could not find container \"cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58\": container with ID starting with cf851447ac6d82376900b22895ca61e59079d3cf92453ae9930415b694d4ef58 not found: ID does not exist" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.777122 4885 scope.go:117] "RemoveContainer" containerID="462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f" Dec 05 21:09:05 crc kubenswrapper[4885]: E1205 21:09:05.777527 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f\": container with ID starting with 462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f not found: ID does not exist" containerID="462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f" Dec 05 21:09:05 crc kubenswrapper[4885]: I1205 21:09:05.777611 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f"} err="failed to get container status \"462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f\": rpc error: code = NotFound desc = could not find container \"462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f\": container with ID starting with 462517a1eab0ae05bae6884d67064aa77fbcd8e785fab2c4ebc2fa81b0989b5f not found: ID does not exist" Dec 05 21:09:07 crc kubenswrapper[4885]: I1205 21:09:07.185618 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" path="/var/lib/kubelet/pods/e900e882-c31d-4c69-a675-852c562f3c3a/volumes" Dec 05 21:09:19 crc kubenswrapper[4885]: I1205 21:09:19.175818 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:09:19 crc kubenswrapper[4885]: E1205 21:09:19.176666 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:09:34 crc kubenswrapper[4885]: I1205 21:09:34.172868 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:09:34 crc kubenswrapper[4885]: E1205 21:09:34.173799 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:09:46 crc kubenswrapper[4885]: I1205 21:09:46.172837 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:09:46 crc kubenswrapper[4885]: E1205 21:09:46.173579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:09:56 crc kubenswrapper[4885]: I1205 21:09:56.519226 4885 scope.go:117] "RemoveContainer" containerID="a384ac356d493bf928123b47f73475cb4580dd254cd1dc948bda92640649b8cb" Dec 05 21:09:58 crc kubenswrapper[4885]: I1205 21:09:58.173198 4885 scope.go:117] "RemoveContainer" 
containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:09:58 crc kubenswrapper[4885]: E1205 21:09:58.173744 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:10:10 crc kubenswrapper[4885]: I1205 21:10:10.172961 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:10:10 crc kubenswrapper[4885]: E1205 21:10:10.173659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:10:22 crc kubenswrapper[4885]: I1205 21:10:22.173737 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:10:22 crc kubenswrapper[4885]: E1205 21:10:22.174819 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:10:37 crc kubenswrapper[4885]: I1205 21:10:37.173557 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:10:37 crc kubenswrapper[4885]: E1205 21:10:37.174830 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" Dec 05 21:10:48 crc kubenswrapper[4885]: I1205 21:10:48.172893 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:10:48 crc kubenswrapper[4885]: I1205 21:10:48.714841 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5"} Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.702178 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gc5r4/must-gather-phlff"] Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703227 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703245 4885 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703266 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703273 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703288 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703297 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703313 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703321 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="extract-content" Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703337 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703344 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: E1205 21:10:53.703361 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703369 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="extract-utilities" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703582 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e900e882-c31d-4c69-a675-852c562f3c3a" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.703617 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9e654d-cb0c-4f31-b798-ddb0da1753a0" containerName="registry-server" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.704912 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.710065 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gc5r4"/"kube-root-ca.crt" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.712048 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gc5r4"/"openshift-service-ca.crt" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.782648 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gc5r4/must-gather-phlff"] Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.881712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.882113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvppm\" (UniqueName: \"kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.983393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvppm\" (UniqueName: \"kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.983511 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:53 crc kubenswrapper[4885]: I1205 21:10:53.984057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:54 crc kubenswrapper[4885]: I1205 21:10:54.009574 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvppm\" (UniqueName: \"kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm\") pod \"must-gather-phlff\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:54 crc kubenswrapper[4885]: I1205 21:10:54.031050 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:10:54 crc kubenswrapper[4885]: I1205 21:10:54.531509 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gc5r4/must-gather-phlff"] Dec 05 21:10:54 crc kubenswrapper[4885]: W1205 21:10:54.542930 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod953be525_991e_40cc_9321_2f8065e030ef.slice/crio-0c2f4fefadcacb80ba800cbf8463f0665b0d2c2c007dcb0c36cf002f02a47df4 WatchSource:0}: Error finding container 0c2f4fefadcacb80ba800cbf8463f0665b0d2c2c007dcb0c36cf002f02a47df4: Status 404 returned error can't find the container with id 0c2f4fefadcacb80ba800cbf8463f0665b0d2c2c007dcb0c36cf002f02a47df4 Dec 05 21:10:54 crc kubenswrapper[4885]: I1205 21:10:54.774706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/must-gather-phlff" event={"ID":"953be525-991e-40cc-9321-2f8065e030ef","Type":"ContainerStarted","Data":"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d"} Dec 05 21:10:54 crc kubenswrapper[4885]: I1205 21:10:54.774984 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/must-gather-phlff" event={"ID":"953be525-991e-40cc-9321-2f8065e030ef","Type":"ContainerStarted","Data":"0c2f4fefadcacb80ba800cbf8463f0665b0d2c2c007dcb0c36cf002f02a47df4"} Dec 05 21:10:55 crc kubenswrapper[4885]: I1205 21:10:55.785682 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/must-gather-phlff" event={"ID":"953be525-991e-40cc-9321-2f8065e030ef","Type":"ContainerStarted","Data":"9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a"} Dec 05 21:10:55 crc kubenswrapper[4885]: I1205 21:10:55.809616 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gc5r4/must-gather-phlff" podStartSLOduration=2.8095924979999998 podStartE2EDuration="2.809592498s" podCreationTimestamp="2025-12-05 21:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:10:55.8074107 +0000 UTC m=+3921.104226361" watchObservedRunningTime="2025-12-05 21:10:55.809592498 +0000 UTC m=+3921.106408159" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.673514 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-5rc2k"] Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.676554 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.678329 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gc5r4"/"default-dockercfg-qd8px" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.770280 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.770617 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxm4x\" (UniqueName: \"kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.872423 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.872552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxm4x\" (UniqueName: \"kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.872568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.908475 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxm4x\" (UniqueName: \"kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x\") pod \"crc-debug-5rc2k\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") " pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:58 crc kubenswrapper[4885]: I1205 21:10:58.994759 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" Dec 05 21:10:59 crc kubenswrapper[4885]: W1205 21:10:59.049732 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1399560d_edca_44ef_98ca_a252533542f1.slice/crio-46aba159ada9a155fe46796801a3dcc20f40c9befbb959c3ff4a99f5cad403d7 WatchSource:0}: Error finding container 46aba159ada9a155fe46796801a3dcc20f40c9befbb959c3ff4a99f5cad403d7: Status 404 returned error can't find the container with id 46aba159ada9a155fe46796801a3dcc20f40c9befbb959c3ff4a99f5cad403d7 Dec 05 21:10:59 crc kubenswrapper[4885]: I1205 21:10:59.823437 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" event={"ID":"1399560d-edca-44ef-98ca-a252533542f1","Type":"ContainerStarted","Data":"fe6dffa1955a9b36b5efe9259644573f798f4ad07ff2268d17fcc9ae17f960ca"} Dec 05 21:10:59 crc kubenswrapper[4885]: I1205 21:10:59.824031 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" event={"ID":"1399560d-edca-44ef-98ca-a252533542f1","Type":"ContainerStarted","Data":"46aba159ada9a155fe46796801a3dcc20f40c9befbb959c3ff4a99f5cad403d7"} Dec 05 21:10:59 crc kubenswrapper[4885]: I1205 21:10:59.841381 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" podStartSLOduration=1.841356565 podStartE2EDuration="1.841356565s" podCreationTimestamp="2025-12-05 21:10:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:10:59.837995609 +0000 UTC m=+3925.134811270" watchObservedRunningTime="2025-12-05 21:10:59.841356565 +0000 UTC m=+3925.138172226" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.649176 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.652555 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.665405 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.676863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.676970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcg89\" (UniqueName: \"kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.676995 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.779377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.779515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcg89\" (UniqueName: \"kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.779546 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.780002 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.780033 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.801976 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mcg89\" (UniqueName: \"kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89\") pod \"redhat-marketplace-lcsjc\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:08 crc kubenswrapper[4885]: I1205 21:11:08.976600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:09 crc kubenswrapper[4885]: I1205 21:11:09.470695 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:09 crc kubenswrapper[4885]: I1205 21:11:09.914499 4885 generic.go:334] "Generic (PLEG): container finished" podID="bef6c205-64fb-46e3-8621-774b262b93a0" containerID="a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc" exitCode=0 Dec 05 21:11:09 crc kubenswrapper[4885]: I1205 21:11:09.914685 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerDied","Data":"a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc"} Dec 05 21:11:09 crc kubenswrapper[4885]: I1205 21:11:09.915907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerStarted","Data":"382c1cab9ff253e5e642ed7119f9d312b79f710807cd5201e70aaa9ea7db9a3e"} Dec 05 21:11:11 crc kubenswrapper[4885]: I1205 21:11:11.938621 4885 generic.go:334] "Generic (PLEG): container finished" podID="bef6c205-64fb-46e3-8621-774b262b93a0" containerID="d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6" exitCode=0 Dec 05 21:11:11 crc kubenswrapper[4885]: I1205 21:11:11.938749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerDied","Data":"d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6"} Dec 05 21:11:12 crc kubenswrapper[4885]: I1205 21:11:12.948687 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerStarted","Data":"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f"} Dec 05 21:11:12 crc kubenswrapper[4885]: I1205 21:11:12.968685 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lcsjc" podStartSLOduration=2.414824317 podStartE2EDuration="4.968660116s" podCreationTimestamp="2025-12-05 21:11:08 +0000 UTC" firstStartedPulling="2025-12-05 21:11:09.918347291 +0000 UTC m=+3935.215162962" lastFinishedPulling="2025-12-05 21:11:12.4721831 +0000 UTC m=+3937.768998761" observedRunningTime="2025-12-05 21:11:12.96589756 +0000 UTC m=+3938.262713231" watchObservedRunningTime="2025-12-05 21:11:12.968660116 +0000 UTC m=+3938.265475777" Dec 05 21:11:18 crc kubenswrapper[4885]: I1205 21:11:18.976746 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:18 crc kubenswrapper[4885]: I1205 21:11:18.977334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:19 crc kubenswrapper[4885]: I1205 21:11:19.039623 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:19 crc kubenswrapper[4885]: I1205 21:11:19.101956 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:19 crc kubenswrapper[4885]: I1205 21:11:19.279114 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.014962 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lcsjc" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="registry-server" containerID="cri-o://8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f" gracePeriod=2 Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.640277 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.741643 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcg89\" (UniqueName: \"kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89\") pod \"bef6c205-64fb-46e3-8621-774b262b93a0\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.741825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities\") pod \"bef6c205-64fb-46e3-8621-774b262b93a0\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.741949 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content\") pod \"bef6c205-64fb-46e3-8621-774b262b93a0\" (UID: \"bef6c205-64fb-46e3-8621-774b262b93a0\") " Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.743041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities" (OuterVolumeSpecName: "utilities") pod "bef6c205-64fb-46e3-8621-774b262b93a0" (UID: "bef6c205-64fb-46e3-8621-774b262b93a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.747261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89" (OuterVolumeSpecName: "kube-api-access-mcg89") pod "bef6c205-64fb-46e3-8621-774b262b93a0" (UID: "bef6c205-64fb-46e3-8621-774b262b93a0"). InnerVolumeSpecName "kube-api-access-mcg89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.760529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bef6c205-64fb-46e3-8621-774b262b93a0" (UID: "bef6c205-64fb-46e3-8621-774b262b93a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.844171 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.844211 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcg89\" (UniqueName: \"kubernetes.io/projected/bef6c205-64fb-46e3-8621-774b262b93a0-kube-api-access-mcg89\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:21 crc kubenswrapper[4885]: I1205 21:11:21.844224 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bef6c205-64fb-46e3-8621-774b262b93a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.025593 4885 generic.go:334] "Generic (PLEG): container finished" podID="bef6c205-64fb-46e3-8621-774b262b93a0" containerID="8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f" exitCode=0 Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.025639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerDied","Data":"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f"} Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.025654 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lcsjc" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.025669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lcsjc" event={"ID":"bef6c205-64fb-46e3-8621-774b262b93a0","Type":"ContainerDied","Data":"382c1cab9ff253e5e642ed7119f9d312b79f710807cd5201e70aaa9ea7db9a3e"} Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.025694 4885 scope.go:117] "RemoveContainer" containerID="8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.057124 4885 scope.go:117] "RemoveContainer" containerID="d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.062467 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.072777 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lcsjc"] Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.084352 4885 scope.go:117] "RemoveContainer" containerID="a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.127688 4885 scope.go:117] "RemoveContainer" containerID="8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f" Dec 05 21:11:22 crc kubenswrapper[4885]: E1205 21:11:22.128158 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f\": container with ID starting with 8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f not found: ID does not exist" containerID="8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.128236 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f"} err="failed to get container status \"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f\": rpc error: code = NotFound desc = could not find container \"8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f\": container with ID starting with 8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f not found: ID does not exist" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.128260 4885 scope.go:117] "RemoveContainer" containerID="d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6" Dec 05 21:11:22 crc kubenswrapper[4885]: E1205 21:11:22.128788 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6\": container with ID starting with d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6 not found: ID does not exist" containerID="d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.128837 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6"} err="failed to get container status \"d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6\": rpc error: code = NotFound desc = could not find container \"d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6\": container with ID starting with d5f284c9079214a1afb5c00fc84246c6521f497f16b040f24bd3ac8d4fa807d6 not found: ID does not exist" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.128869 4885 scope.go:117] "RemoveContainer" containerID="a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc" Dec 05 21:11:22 crc kubenswrapper[4885]: E1205 21:11:22.129168 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc\": container with ID starting with a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc not found: ID does not exist" containerID="a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc" Dec 05 21:11:22 crc kubenswrapper[4885]: I1205 21:11:22.129196 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc"} err="failed to get container status \"a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc\": rpc error: code = NotFound desc = could not find container \"a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc\": container with ID starting with a10b7d3d30df022e6a4b88f2567df5cec266fe96024ae079b9a7fc8e292478dc not found: ID does not exist" Dec 05 21:11:23 crc kubenswrapper[4885]: I1205 21:11:23.209420 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" path="/var/lib/kubelet/pods/bef6c205-64fb-46e3-8621-774b262b93a0/volumes" Dec 05 21:11:34 crc kubenswrapper[4885]: I1205 21:11:34.168982 4885 generic.go:334] "Generic (PLEG): container finished" podID="1399560d-edca-44ef-98ca-a252533542f1" containerID="fe6dffa1955a9b36b5efe9259644573f798f4ad07ff2268d17fcc9ae17f960ca" exitCode=0 Dec 05 21:11:34 crc kubenswrapper[4885]: I1205 
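[Editor's note: the RemoveContainer / "ContainerStatus from runtime service failed" pairs above are the kubelet's idempotent cleanup: once CRI-O has deleted a container, a follow-up status lookup returns gRPC NotFound, which is logged but treated as already-deleted rather than a failure. A minimal Go sketch of that check against the CRI-O socket; the socket path is the usual default and the container ID is the registry-server ID from this log, both assumptions on any other machine.]

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Same CRI endpoint the kubelet talks to on this node (assumed default path).
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	id := "8f9622e507f723f973c8a00c9417751f1daeca2543fc4dcd758c55b17322c08f"
	_, err = rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
	if status.Code(err) == codes.NotFound {
		// The condition the kubelet logs as "DeleteContainer returned error" at
		// info level: the container is already gone, so cleanup proceeds.
		fmt.Println("container already deleted; nothing to do")
		return
	}
	if err != nil {
		panic(err)
	}
	fmt.Println("container still present; a remover would call RemoveContainer next")
}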
Dec 05 21:11:34 crc kubenswrapper[4885]: I1205 21:11:34.168982 4885 generic.go:334] "Generic (PLEG): container finished" podID="1399560d-edca-44ef-98ca-a252533542f1" containerID="fe6dffa1955a9b36b5efe9259644573f798f4ad07ff2268d17fcc9ae17f960ca" exitCode=0
Dec 05 21:11:34 crc kubenswrapper[4885]: I1205 21:11:34.169045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k" event={"ID":"1399560d-edca-44ef-98ca-a252533542f1","Type":"ContainerDied","Data":"fe6dffa1955a9b36b5efe9259644573f798f4ad07ff2268d17fcc9ae17f960ca"}
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.285541 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k"
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.325545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host\") pod \"1399560d-edca-44ef-98ca-a252533542f1\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") "
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.325625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host" (OuterVolumeSpecName: "host") pod "1399560d-edca-44ef-98ca-a252533542f1" (UID: "1399560d-edca-44ef-98ca-a252533542f1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.325663 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxm4x\" (UniqueName: \"kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x\") pod \"1399560d-edca-44ef-98ca-a252533542f1\" (UID: \"1399560d-edca-44ef-98ca-a252533542f1\") "
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.326114 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1399560d-edca-44ef-98ca-a252533542f1-host\") on node \"crc\" DevicePath \"\""
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.329976 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-5rc2k"]
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.331318 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x" (OuterVolumeSpecName: "kube-api-access-bxm4x") pod "1399560d-edca-44ef-98ca-a252533542f1" (UID: "1399560d-edca-44ef-98ca-a252533542f1"). InnerVolumeSpecName "kube-api-access-bxm4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.343407 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-5rc2k"]
Dec 05 21:11:35 crc kubenswrapper[4885]: I1205 21:11:35.428366 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxm4x\" (UniqueName: \"kubernetes.io/projected/1399560d-edca-44ef-98ca-a252533542f1-kube-api-access-bxm4x\") on node \"crc\" DevicePath \"\""
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.196899 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46aba159ada9a155fe46796801a3dcc20f40c9befbb959c3ff4a99f5cad403d7"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.196959 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-5rc2k"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.544459 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-qd7bw"]
Dec 05 21:11:36 crc kubenswrapper[4885]: E1205 21:11:36.544834 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="extract-content"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.544847 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="extract-content"
Dec 05 21:11:36 crc kubenswrapper[4885]: E1205 21:11:36.544878 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1399560d-edca-44ef-98ca-a252533542f1" containerName="container-00"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.544884 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1399560d-edca-44ef-98ca-a252533542f1" containerName="container-00"
Dec 05 21:11:36 crc kubenswrapper[4885]: E1205 21:11:36.544895 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="registry-server"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.544901 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="registry-server"
Dec 05 21:11:36 crc kubenswrapper[4885]: E1205 21:11:36.544915 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="extract-utilities"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.544921 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="extract-utilities"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.545160 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef6c205-64fb-46e3-8621-774b262b93a0" containerName="registry-server"
Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.545182 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1399560d-edca-44ef-98ca-a252533542f1" containerName="container-00"
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.548153 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gc5r4"/"default-dockercfg-qd8px" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.554269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jcf\" (UniqueName: \"kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.554652 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.656078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jcf\" (UniqueName: \"kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.656400 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.656505 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.674065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jcf\" (UniqueName: \"kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf\") pod \"crc-debug-qd7bw\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: I1205 21:11:36.862915 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:36 crc kubenswrapper[4885]: W1205 21:11:36.905897 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23fd8bb3_fcb4_47bf_85f4_21b67d141823.slice/crio-46a1b2c2e76a86f93d2c436f4bd407e85d10cf24ab3220066a66cef9ec0a2497 WatchSource:0}: Error finding container 46a1b2c2e76a86f93d2c436f4bd407e85d10cf24ab3220066a66cef9ec0a2497: Status 404 returned error can't find the container with id 46a1b2c2e76a86f93d2c436f4bd407e85d10cf24ab3220066a66cef9ec0a2497 Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.183096 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1399560d-edca-44ef-98ca-a252533542f1" path="/var/lib/kubelet/pods/1399560d-edca-44ef-98ca-a252533542f1/volumes" Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.209484 4885 generic.go:334] "Generic (PLEG): container finished" podID="23fd8bb3-fcb4-47bf-85f4-21b67d141823" containerID="0f85a23d416af9f4ed3bc27916ed6788df7b033ff44a8f748c949f67210ddba7" exitCode=0 Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.209528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" event={"ID":"23fd8bb3-fcb4-47bf-85f4-21b67d141823","Type":"ContainerDied","Data":"0f85a23d416af9f4ed3bc27916ed6788df7b033ff44a8f748c949f67210ddba7"} Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.209554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" event={"ID":"23fd8bb3-fcb4-47bf-85f4-21b67d141823","Type":"ContainerStarted","Data":"46a1b2c2e76a86f93d2c436f4bd407e85d10cf24ab3220066a66cef9ec0a2497"} Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.643727 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-qd7bw"] Dec 05 21:11:37 crc kubenswrapper[4885]: I1205 21:11:37.656800 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-qd7bw"] Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.320504 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.490967 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host\") pod \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.491103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77jcf\" (UniqueName: \"kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf\") pod \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\" (UID: \"23fd8bb3-fcb4-47bf-85f4-21b67d141823\") " Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.492192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host" (OuterVolumeSpecName: "host") pod "23fd8bb3-fcb4-47bf-85f4-21b67d141823" (UID: "23fd8bb3-fcb4-47bf-85f4-21b67d141823"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.498898 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf" (OuterVolumeSpecName: "kube-api-access-77jcf") pod "23fd8bb3-fcb4-47bf-85f4-21b67d141823" (UID: "23fd8bb3-fcb4-47bf-85f4-21b67d141823"). InnerVolumeSpecName "kube-api-access-77jcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.592872 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fd8bb3-fcb4-47bf-85f4-21b67d141823-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.592911 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77jcf\" (UniqueName: \"kubernetes.io/projected/23fd8bb3-fcb4-47bf-85f4-21b67d141823-kube-api-access-77jcf\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.822242 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-8ttc9"] Dec 05 21:11:38 crc kubenswrapper[4885]: E1205 21:11:38.822936 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fd8bb3-fcb4-47bf-85f4-21b67d141823" containerName="container-00" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.822948 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fd8bb3-fcb4-47bf-85f4-21b67d141823" containerName="container-00" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.823155 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fd8bb3-fcb4-47bf-85f4-21b67d141823" containerName="container-00" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.823723 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.896898 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7dt8\" (UniqueName: \"kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.897068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.997926 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dt8\" (UniqueName: \"kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.998067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:38 crc kubenswrapper[4885]: I1205 21:11:38.998200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.018529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dt8\" (UniqueName: \"kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8\") pod \"crc-debug-8ttc9\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.141216 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:39 crc kubenswrapper[4885]: W1205 21:11:39.180211 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod723133c6_d80e_4dd9_94d1_a892ed270483.slice/crio-d1bdbb93c93747c85a0513139da62515b3fee8545691ba29c75e0ba00218813a WatchSource:0}: Error finding container d1bdbb93c93747c85a0513139da62515b3fee8545691ba29c75e0ba00218813a: Status 404 returned error can't find the container with id d1bdbb93c93747c85a0513139da62515b3fee8545691ba29c75e0ba00218813a Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.192861 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fd8bb3-fcb4-47bf-85f4-21b67d141823" path="/var/lib/kubelet/pods/23fd8bb3-fcb4-47bf-85f4-21b67d141823/volumes" Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.228284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" event={"ID":"723133c6-d80e-4dd9-94d1-a892ed270483","Type":"ContainerStarted","Data":"d1bdbb93c93747c85a0513139da62515b3fee8545691ba29c75e0ba00218813a"} Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.232145 4885 scope.go:117] "RemoveContainer" containerID="0f85a23d416af9f4ed3bc27916ed6788df7b033ff44a8f748c949f67210ddba7" Dec 05 21:11:39 crc kubenswrapper[4885]: I1205 21:11:39.232221 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-qd7bw" Dec 05 21:11:40 crc kubenswrapper[4885]: I1205 21:11:40.245685 4885 generic.go:334] "Generic (PLEG): container finished" podID="723133c6-d80e-4dd9-94d1-a892ed270483" containerID="3b88bd2f569b81786886bc6841aa0a7e755737c627f1dcdc465acb9d6243cce5" exitCode=0 Dec 05 21:11:40 crc kubenswrapper[4885]: I1205 21:11:40.246125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" event={"ID":"723133c6-d80e-4dd9-94d1-a892ed270483","Type":"ContainerDied","Data":"3b88bd2f569b81786886bc6841aa0a7e755737c627f1dcdc465acb9d6243cce5"} Dec 05 21:11:40 crc kubenswrapper[4885]: I1205 21:11:40.292520 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-8ttc9"] Dec 05 21:11:40 crc kubenswrapper[4885]: I1205 21:11:40.301653 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gc5r4/crc-debug-8ttc9"] Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.366997 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.550613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7dt8\" (UniqueName: \"kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8\") pod \"723133c6-d80e-4dd9-94d1-a892ed270483\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.550731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host\") pod \"723133c6-d80e-4dd9-94d1-a892ed270483\" (UID: \"723133c6-d80e-4dd9-94d1-a892ed270483\") " Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.550778 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host" (OuterVolumeSpecName: "host") pod "723133c6-d80e-4dd9-94d1-a892ed270483" (UID: "723133c6-d80e-4dd9-94d1-a892ed270483"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.551238 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723133c6-d80e-4dd9-94d1-a892ed270483-host\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.557240 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8" (OuterVolumeSpecName: "kube-api-access-w7dt8") pod "723133c6-d80e-4dd9-94d1-a892ed270483" (UID: "723133c6-d80e-4dd9-94d1-a892ed270483"). InnerVolumeSpecName "kube-api-access-w7dt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:11:41 crc kubenswrapper[4885]: I1205 21:11:41.654068 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7dt8\" (UniqueName: \"kubernetes.io/projected/723133c6-d80e-4dd9-94d1-a892ed270483-kube-api-access-w7dt8\") on node \"crc\" DevicePath \"\"" Dec 05 21:11:42 crc kubenswrapper[4885]: I1205 21:11:42.270099 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/crc-debug-8ttc9" Dec 05 21:11:42 crc kubenswrapper[4885]: I1205 21:11:42.279333 4885 scope.go:117] "RemoveContainer" containerID="3b88bd2f569b81786886bc6841aa0a7e755737c627f1dcdc465acb9d6243cce5" Dec 05 21:11:43 crc kubenswrapper[4885]: I1205 21:11:43.183543 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723133c6-d80e-4dd9-94d1-a892ed270483" path="/var/lib/kubelet/pods/723133c6-d80e-4dd9-94d1-a892ed270483/volumes" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.185784 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-787956bb96-gzkln_8af09c24-8a48-47cc-ad7c-1778f9a27547/barbican-api/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.305137 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-787956bb96-gzkln_8af09c24-8a48-47cc-ad7c-1778f9a27547/barbican-api-log/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.371236 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f44776d88-2k4qb_84964967-0f37-47c8-919f-3a68040a1d36/barbican-keystone-listener/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.446899 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f44776d88-2k4qb_84964967-0f37-47c8-919f-3a68040a1d36/barbican-keystone-listener-log/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.530920 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-549d6dd897-jd542_f10197ca-8886-4668-b3e8-1179bdb7041d/barbican-worker/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.582542 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-549d6dd897-jd542_f10197ca-8886-4668-b3e8-1179bdb7041d/barbican-worker-log/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.731355 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8p52d_54bae71b-4af1-49b5-a41b-58e6aafd26ca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.814400 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/ceilometer-central-agent/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.857044 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/ceilometer-notification-agent/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.920072 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/proxy-httpd/0.log" Dec 05 21:12:06 crc kubenswrapper[4885]: I1205 21:12:06.978152 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a72398e-830b-402b-83c9-4ea93aa05c76/sg-core/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.135958 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_232e06c4-ecaf-4959-b1e2-0c183f6afb64/cinder-api-log/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.143152 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_232e06c4-ecaf-4959-b1e2-0c183f6afb64/cinder-api/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.306478 4885 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85ff2041-1a3f-46c9-ba86-9440a4c1e129/cinder-scheduler/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.453370 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nvh26_cf7e7e25-a243-4caf-8b1a-34c1830a097e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.458820 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85ff2041-1a3f-46c9-ba86-9440a4c1e129/probe/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.637038 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-klnxp_9487fa66-920b-41fc-beb6-4dffcb4a898a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.640557 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/init/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.897064 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/dnsmasq-dns/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.921632 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78f49d79c7-7qk6g_2bb6d6a7-1ca1-4089-91e9-f8641f2f262e/init/0.log" Dec 05 21:12:07 crc kubenswrapper[4885]: I1205 21:12:07.937960 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6wgq9_a16820a2-be4e-45d6-bcef-91810571b95f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.257338 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c88a6c22-ae9a-4d43-9a63-e6ea351eb012/glance-log/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.296740 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c88a6c22-ae9a-4d43-9a63-e6ea351eb012/glance-httpd/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.455607 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d/glance-httpd/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.486536 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44f2a77b-c92f-4e65-b6f2-e5cfffcaaa6d/glance-log/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.647652 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d9999949d-c22ch_d0f84b71-1907-4f71-833d-1e5561a4f0f8/horizon/0.log" Dec 05 21:12:08 crc kubenswrapper[4885]: I1205 21:12:08.799727 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pqprh_9c9ed39f-ee5e-4c66-8171-488ed01847db/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:09 crc kubenswrapper[4885]: I1205 21:12:09.002233 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-d6c5z_d0a9ab2d-1012-41ba-b810-c7f7f127330e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:09 
crc kubenswrapper[4885]: I1205 21:12:09.025007 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d9999949d-c22ch_d0f84b71-1907-4f71-833d-1e5561a4f0f8/horizon-log/0.log" Dec 05 21:12:09 crc kubenswrapper[4885]: I1205 21:12:09.260187 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bdf6f4c4b-9n2vm_a8ffb925-d20c-4c24-a3b2-158d9c347b6b/keystone-api/0.log" Dec 05 21:12:09 crc kubenswrapper[4885]: I1205 21:12:09.276450 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416141-4x6cq_39c99e0c-b27c-4703-a5c0-a380c33df665/keystone-cron/0.log" Dec 05 21:12:09 crc kubenswrapper[4885]: I1205 21:12:09.382147 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_34d68d6f-5309-4dd5-b361-811ddff64379/kube-state-metrics/0.log" Dec 05 21:12:09 crc kubenswrapper[4885]: I1205 21:12:09.492035 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wqzbt_7b51c87e-b603-43e2-bb06-a8e9a0416a59/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:10 crc kubenswrapper[4885]: I1205 21:12:10.240981 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-94b44cc8f-5tpnj_0437ab7b-cd9d-46e8-9bca-7acdbefda1be/neutron-api/0.log" Dec 05 21:12:10 crc kubenswrapper[4885]: I1205 21:12:10.241531 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-94b44cc8f-5tpnj_0437ab7b-cd9d-46e8-9bca-7acdbefda1be/neutron-httpd/0.log" Dec 05 21:12:10 crc kubenswrapper[4885]: I1205 21:12:10.361923 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s879d_525d9ebb-07fb-41b7-9059-d609ed9cac0e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:10 crc kubenswrapper[4885]: I1205 21:12:10.750874 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1e275487-025f-4c31-a7f4-267b05218da9/nova-api-log/0.log" Dec 05 21:12:10 crc kubenswrapper[4885]: I1205 21:12:10.895483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b8f6973a-6753-4845-a273-798f031cf4d6/nova-cell0-conductor-conductor/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.128752 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0d7ef835-7090-43c0-b489-8e1adc41fd47/nova-cell1-conductor-conductor/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.181276 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1e275487-025f-4c31-a7f4-267b05218da9/nova-api-api/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.240072 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d5627a8a-d602-4c23-bb2f-e07f9c2a8681/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.364305 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9j89h_453597ee-fc9f-4fc6-beb2-e4c75e1236db/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.549095 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_41069d3f-c9d5-4278-8171-cebf5434937e/nova-metadata-log/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.871705 4885 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_0d49d4cd-955c-41c7-8df0-63b364cb3e2d/nova-scheduler-scheduler/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.879532 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/mysql-bootstrap/0.log" Dec 05 21:12:11 crc kubenswrapper[4885]: I1205 21:12:11.981746 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/mysql-bootstrap/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.122004 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93184776-73bf-4ff3-9f7f-66b46fd511ed/galera/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.222335 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/mysql-bootstrap/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.442062 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/mysql-bootstrap/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.498814 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3e1a8619-8184-43c1-9444-8e86fbc4213d/galera/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.620866 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_60b132f9-5036-44cd-8d19-e60a39760da0/openstackclient/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.762164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-z7wfg_28451893-15ed-4dc1-a6ef-f93fed27316e/openstack-network-exporter/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.887301 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_41069d3f-c9d5-4278-8171-cebf5434937e/nova-metadata-metadata/0.log" Dec 05 21:12:12 crc kubenswrapper[4885]: I1205 21:12:12.957807 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server-init/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.104523 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovs-vswitchd/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.139656 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.140755 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hgth4_32c5b9a2-f65e-4223-ac3f-f49a4e160454/ovsdb-server-init/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.292417 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ptwvl_0bcbe5dc-19cf-4412-ab48-1d2c3cebbf99/ovn-controller/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.364359 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-m548j_de5ebae2-9fe8-4b8a-ab85-60226fa56525/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.681288 4885 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2cf8581f-1009-4a26-9642-4e154e83dbc1/ovn-northd/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.683510 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2cf8581f-1009-4a26-9642-4e154e83dbc1/openstack-network-exporter/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.853781 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91d9cbb0-7966-411b-86e4-b80882da454e/openstack-network-exporter/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.938143 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91d9cbb0-7966-411b-86e4-b80882da454e/ovsdbserver-nb/0.log" Dec 05 21:12:13 crc kubenswrapper[4885]: I1205 21:12:13.977555 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7edaf8ab-283b-46bc-89e2-a3c8f681624b/openstack-network-exporter/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.073233 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7edaf8ab-283b-46bc-89e2-a3c8f681624b/ovsdbserver-sb/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.199474 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d98fd5798-8jhxf_eca7ccc4-d1ff-402c-9fe8-0c61746d41d1/placement-api/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.293014 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d98fd5798-8jhxf_eca7ccc4-d1ff-402c-9fe8-0c61746d41d1/placement-log/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.396637 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/setup-container/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.604559 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/setup-container/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.680763 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38cec51a-a7b6-420f-8efe-f21b3acf2f3f/rabbitmq/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.724037 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/setup-container/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.905374 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/rabbitmq/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.946135 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cdc87c63-a124-485c-8f34-016d17a58f29/setup-container/0.log" Dec 05 21:12:14 crc kubenswrapper[4885]: I1205 21:12:14.970070 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9h9vk_b27a1f4c-ba65-4b22-885a-e642064f7c27/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.140005 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdzlz_a40c582a-e811-4e60-a7fe-1bf467d32e96/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:15 crc 
kubenswrapper[4885]: I1205 21:12:15.215881 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7dc92_489dbc8e-e2ca-41aa-9e48-ca81bea02758/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.432552 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ptcp8_59678b29-6ffe-4d18-a8bb-8bf4717f9b10/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.433601 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jdng_8f98cb4a-349f-443b-aab3-686a3d0bcc67/ssh-known-hosts-edpm-deployment/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.709337 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-56b6f678f7-nt7kq_5df6ff8a-e66c-402d-a7cd-63125b9c6cae/proxy-server/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.787595 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-56b6f678f7-nt7kq_5df6ff8a-e66c-402d-a7cd-63125b9c6cae/proxy-httpd/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.845594 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2j6cb_c5c452f6-0d03-4e67-bab0-0dcb1926f523/swift-ring-rebalance/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.936104 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-auditor/0.log" Dec 05 21:12:15 crc kubenswrapper[4885]: I1205 21:12:15.988155 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-reaper/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.095456 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-replicator/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.153339 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/account-server/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.162139 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-auditor/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.249268 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-replicator/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.327885 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-server/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.400297 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/container-updater/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.469449 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-expirer/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.494816 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-auditor/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.596086 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-replicator/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.598780 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-server/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.679695 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/object-updater/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.722753 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/rsync/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.819267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18b127df-3095-45b6-b347-f1906d6317fe/swift-recon-cron/0.log" Dec 05 21:12:16 crc kubenswrapper[4885]: I1205 21:12:16.957192 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5m26m_d6e72054-a861-40ce-b2c9-6212896baaf4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:17 crc kubenswrapper[4885]: I1205 21:12:17.064130 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9f679f95-52b0-4cdd-a9f2-f7dcd5f23d2d/tempest-tests-tempest-tests-runner/0.log" Dec 05 21:12:17 crc kubenswrapper[4885]: I1205 21:12:17.156792 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8faf97ea-3453-4e96-8a29-a7a30aec54c1/test-operator-logs-container/0.log" Dec 05 21:12:17 crc kubenswrapper[4885]: I1205 21:12:17.626517 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xtv4q_f6fcaa99-97aa-46d8-be19-5cac454e2f77/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 21:12:25 crc kubenswrapper[4885]: I1205 21:12:25.392720 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_12c40607-2770-4b97-95f1-6ac26280d337/memcached/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.503101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.702797 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.704751 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.722352 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.868110 4885 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/util/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.871442 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/extract/0.log" Dec 05 21:12:43 crc kubenswrapper[4885]: I1205 21:12:43.884274 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafsfq2c_9339b513-f7aa-4ad6-9e87-b585e81c0577/pull/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.050279 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-cqj46_74869c39-a4c4-4812-8656-4751d25ef987/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.104198 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-cqj46_74869c39-a4c4-4812-8656-4751d25ef987/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.119789 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-s4ftd_93741f1b-6823-4374-927f-38d95ba139f5/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.265846 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-nqshj_6a0f526a-c496-478e-bc4c-e6478ebeb3ea/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.298049 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-nqshj_6a0f526a-c496-478e-bc4c-e6478ebeb3ea/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.303269 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-s4ftd_93741f1b-6823-4374-927f-38d95ba139f5/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.421775 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-rqh2l_c942221f-6ad2-4109-9975-ec8054686283/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.534098 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-rqh2l_c942221f-6ad2-4109-9975-ec8054686283/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.629139 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kgdg2_9034e951-dbbb-4927-b9fa-fa2e83c1595c/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.641638 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kgdg2_9034e951-dbbb-4927-b9fa-fa2e83c1595c/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.700366 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-zz7df_ee66e99c-4761-43a5-a55c-b28957859913/kube-rbac-proxy/0.log" Dec 05 21:12:44 crc 
kubenswrapper[4885]: I1205 21:12:44.807579 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-zz7df_ee66e99c-4761-43a5-a55c-b28957859913/manager/0.log" Dec 05 21:12:44 crc kubenswrapper[4885]: I1205 21:12:44.891424 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dpqcg_f9775930-6d69-4ad4-a249-f5d2f270b365/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.064687 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dpqcg_f9775930-6d69-4ad4-a249-f5d2f270b365/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.102792 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-z27c2_741c1713-f931-471e-ad95-99d16600ab76/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.125112 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-z27c2_741c1713-f931-471e-ad95-99d16600ab76/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.308348 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-r6ljq_da47cf7f-37ab-4d5d-99b1-1b312002f83e/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.330002 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-r6ljq_da47cf7f-37ab-4d5d-99b1-1b312002f83e/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.482267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4vb99_ca2be922-afb3-4640-bdad-cfd3b0164d52/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.497250 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4vb99_ca2be922-afb3-4640-bdad-cfd3b0164d52/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.556259 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hkw2j_e12a10c6-f52c-4348-bb54-356af7632dd4/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.672178 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hkw2j_e12a10c6-f52c-4348-bb54-356af7632dd4/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.719362 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_33f07e6f-9ac8-461d-b455-ad634c2e255c/kube-rbac-proxy/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.782724 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-z4wtk_33f07e6f-9ac8-461d-b455-ad634c2e255c/manager/0.log" Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.905131 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5c5m_3e2eaf31-e16e-4072-ae6b-a5c9eda46732/kube-rbac-proxy/0.log" 
Dec 05 21:12:45 crc kubenswrapper[4885]: I1205 21:12:45.994422 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5c5m_3e2eaf31-e16e-4072-ae6b-a5c9eda46732/manager/0.log" Dec 05 21:12:46 crc kubenswrapper[4885]: I1205 21:12:46.078533 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gwtxz_aed37ead-6406-43f0-a6f5-4e8864935a58/kube-rbac-proxy/0.log" Dec 05 21:12:46 crc kubenswrapper[4885]: I1205 21:12:46.103786 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gwtxz_aed37ead-6406-43f0-a6f5-4e8864935a58/manager/0.log" Dec 05 21:12:46 crc kubenswrapper[4885]: I1205 21:12:46.214703 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5qfqlm_fdb3c987-9d79-4920-9b95-1be3a3dbc622/kube-rbac-proxy/0.log" Dec 05 21:12:46 crc kubenswrapper[4885]: I1205 21:12:46.249364 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5qfqlm_fdb3c987-9d79-4920-9b95-1be3a3dbc622/manager/0.log" Dec 05 21:12:46 crc kubenswrapper[4885]: I1205 21:12:46.692002 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-qk2s7_15ce450d-0098-4b25-afd2-5bda05cfb5b0/operator/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.098934 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jm7lc_0f1ef804-3daa-44e0-a978-f6edc8efab00/registry-server/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.290860 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t4mch_06e1a4eb-c6cb-4146-b2f9-484c2e699a7e/kube-rbac-proxy/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.339305 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4q2vd_2eea8037-d11c-47ee-9bc9-67deafc20268/kube-rbac-proxy/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.343887 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t4mch_06e1a4eb-c6cb-4146-b2f9-484c2e699a7e/manager/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.499509 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4q2vd_2eea8037-d11c-47ee-9bc9-67deafc20268/manager/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.615430 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qpp7t_18cedf03-5e88-4513-b2cc-e364e749f219/operator/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.706570 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-b47j2_acaad339-be87-48ab-aee8-7f4637190768/manager/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.717810 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-t4xtt_c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5/kube-rbac-proxy/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.794579 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rqs2p_f68526b5-c6b6-484e-b476-1e4c76ba71fd/kube-rbac-proxy/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.805694 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-t4xtt_c20bdf47-2333-40eb-b5e1-4ad4ad32cdd5/manager/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.924575 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rqs2p_f68526b5-c6b6-484e-b476-1e4c76ba71fd/manager/0.log" Dec 05 21:12:47 crc kubenswrapper[4885]: I1205 21:12:47.979893 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-565xh_49b39782-af0e-4f86-89f4-96582b6a8336/kube-rbac-proxy/0.log" Dec 05 21:12:48 crc kubenswrapper[4885]: I1205 21:12:48.010151 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-565xh_49b39782-af0e-4f86-89f4-96582b6a8336/manager/0.log" Dec 05 21:12:48 crc kubenswrapper[4885]: I1205 21:12:48.088859 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nrtkv_f9ccfa3f-a548-4e32-9318-b3f2cb19ccca/manager/0.log" Dec 05 21:12:48 crc kubenswrapper[4885]: I1205 21:12:48.116615 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nrtkv_f9ccfa3f-a548-4e32-9318-b3f2cb19ccca/kube-rbac-proxy/0.log" Dec 05 21:13:08 crc kubenswrapper[4885]: I1205 21:13:08.017865 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hfsls_1ad3cb2f-89ef-4f6e-9d48-f3eb33e4581c/control-plane-machine-set-operator/0.log" Dec 05 21:13:08 crc kubenswrapper[4885]: I1205 21:13:08.171883 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vs7jr_24653880-7b0f-4174-ac74-5d13d99975e9/kube-rbac-proxy/0.log" Dec 05 21:13:08 crc kubenswrapper[4885]: I1205 21:13:08.175613 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vs7jr_24653880-7b0f-4174-ac74-5d13d99975e9/machine-api-operator/0.log" Dec 05 21:13:16 crc kubenswrapper[4885]: I1205 21:13:16.631340 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:13:16 crc kubenswrapper[4885]: I1205 21:13:16.631921 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:13:20 crc kubenswrapper[4885]: I1205 21:13:20.510950 4885 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-z8hk7_c7c60e10-72a8-4031-8e22-2f7b2ccc720c/cert-manager-controller/0.log" Dec 05 21:13:20 crc kubenswrapper[4885]: I1205 21:13:20.685786 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6th47_c3ccb845-eaa6-44fd-b7ea-4f3739516528/cert-manager-cainjector/0.log" Dec 05 21:13:20 crc kubenswrapper[4885]: I1205 21:13:20.770183 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4swqf_21e6c715-7d1f-405a-9d66-8ac102a2e623/cert-manager-webhook/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.520764 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-vgfwc_72454e30-d40f-408d-93f6-c0cf1ce2f400/nmstate-console-plugin/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.688562 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ndhsp_912fc0d4-121a-4073-9e85-a2277a5078d8/nmstate-handler/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.791287 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lhpld_7e345d16-e7f9-4881-a031-eb5ef37e22b3/nmstate-metrics/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.791598 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lhpld_7e345d16-e7f9-4881-a031-eb5ef37e22b3/kube-rbac-proxy/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.898187 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-p8qxg_5275a59b-4935-4ce8-8552-ed28f0377be5/nmstate-operator/0.log" Dec 05 21:13:33 crc kubenswrapper[4885]: I1205 21:13:33.985496 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-ph4g7_7001b6ac-1126-4d81-9148-47e6f7f830c1/nmstate-webhook/0.log" Dec 05 21:13:46 crc kubenswrapper[4885]: I1205 21:13:46.631468 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:13:46 crc kubenswrapper[4885]: I1205 21:13:46.632067 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:13:48 crc kubenswrapper[4885]: I1205 21:13:48.786993 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gwwj5_61bdac93-6e5a-4b95-a146-ea0874dc5962/kube-rbac-proxy/0.log" Dec 05 21:13:48 crc kubenswrapper[4885]: I1205 21:13:48.875942 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gwwj5_61bdac93-6e5a-4b95-a146-ea0874dc5962/controller/0.log" Dec 05 21:13:48 crc kubenswrapper[4885]: I1205 21:13:48.981770 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.158094 4885 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.168335 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.186006 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.186427 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.388408 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.443687 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.448774 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.471625 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.613144 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-reloader/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.615562 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-metrics/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.618684 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/cp-frr-files/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.630888 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/controller/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.778398 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/kube-rbac-proxy/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.804765 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/kube-rbac-proxy-frr/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.830749 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/frr-metrics/0.log" Dec 05 21:13:49 crc kubenswrapper[4885]: I1205 21:13:49.973390 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/reloader/0.log" Dec 05 21:13:50 crc kubenswrapper[4885]: I1205 21:13:50.009829 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p9slq_a1b920f0-0596-43ef-b94b-d3035f0e5e1c/frr-k8s-webhook-server/0.log" Dec 05 21:13:50 crc kubenswrapper[4885]: I1205 21:13:50.225357 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fb9f8748-k8dk7_dd4c62d1-80af-4d61-bc04-6ac5c8259121/manager/0.log" Dec 05 21:13:50 crc kubenswrapper[4885]: I1205 21:13:50.380107 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5dcf889d57-wtshh_8263fedc-0c2a-4de8-8d5c-47aa32b745ee/webhook-server/0.log" Dec 05 21:13:50 crc kubenswrapper[4885]: I1205 21:13:50.425385 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5jq2d_2fa36864-508b-488b-8830-d60337213cca/kube-rbac-proxy/0.log" Dec 05 21:13:51 crc kubenswrapper[4885]: I1205 21:13:51.083758 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5jq2d_2fa36864-508b-488b-8830-d60337213cca/speaker/0.log" Dec 05 21:13:51 crc kubenswrapper[4885]: I1205 21:13:51.266537 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qkxh_f0f8b2ce-10b2-491b-9100-34835c07e175/frr/0.log" Dec 05 21:14:04 crc kubenswrapper[4885]: I1205 21:14:04.213395 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:14:04 crc kubenswrapper[4885]: I1205 21:14:04.806798 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:14:04 crc kubenswrapper[4885]: I1205 21:14:04.807784 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:14:04 crc kubenswrapper[4885]: I1205 21:14:04.825217 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.027529 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/util/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.038568 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/extract/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.039613 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fw6gv8_b4c07e66-01e1-4851-92f0-2e498a2f04bf/pull/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.181286 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.363651 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.364713 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.385636 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.751305 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/pull/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.779642 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/extract/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.784338 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83r6f8m_2799bcd8-694a-4fdc-b243-2780761ecda7/util/0.log" Dec 05 21:14:05 crc kubenswrapper[4885]: I1205 21:14:05.913915 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.042890 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.058070 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.100800 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.248090 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-utilities/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.261132 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/extract-content/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.478763 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:14:06 crc kubenswrapper[4885]: I1205 21:14:06.564589 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-65plm_f0443767-ff82-48a9-8fc4-c981ebe6ebac/registry-server/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.104495 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.119011 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.139918 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.306915 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-utilities/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.309613 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/extract-content/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.518667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-djpjw_b7708f77-d399-4d7e-8034-9e043e56aabe/marketplace-operator/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.628491 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.802378 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.843990 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.858995 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:14:07 crc kubenswrapper[4885]: I1205 21:14:07.867268 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwl8v_f2250db3-b5b2-435f-bd9e-1b599f663d70/registry-server/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.359783 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-content/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.391351 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.411912 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/extract-utilities/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.543967 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnv28_fd9a5eba-660b-489b-b9f8-3a5366d313c9/registry-server/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.684740 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.712547 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.721149 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.851623 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-utilities/0.log" Dec 05 21:14:08 crc kubenswrapper[4885]: I1205 21:14:08.876220 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/extract-content/0.log" Dec 05 21:14:09 crc kubenswrapper[4885]: I1205 21:14:09.321959 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2wmcd_d1cc6544-7046-414f-9f36-71801abdfe03/registry-server/0.log" Dec 05 21:14:16 crc kubenswrapper[4885]: I1205 21:14:16.630621 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:14:16 crc kubenswrapper[4885]: I1205 21:14:16.632223 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:14:16 crc kubenswrapper[4885]: I1205 21:14:16.632356 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" Dec 05 21:14:16 crc kubenswrapper[4885]: I1205 21:14:16.633207 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:14:16 crc kubenswrapper[4885]: I1205 21:14:16.633347 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5" gracePeriod=600 Dec 05 21:14:17 crc kubenswrapper[4885]: I1205 21:14:17.710133 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5" exitCode=0 Dec 05 21:14:17 crc kubenswrapper[4885]: I1205 21:14:17.710184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" 
event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5"} Dec 05 21:14:17 crc kubenswrapper[4885]: I1205 21:14:17.710581 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerStarted","Data":"f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"} Dec 05 21:14:17 crc kubenswrapper[4885]: I1205 21:14:17.710603 4885 scope.go:117] "RemoveContainer" containerID="fd4753af3494c8643a313ec98d8e6d5d66a2556e87173df3882a1dfb5e91b847" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.202892 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm"] Dec 05 21:15:00 crc kubenswrapper[4885]: E1205 21:15:00.205587 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723133c6-d80e-4dd9-94d1-a892ed270483" containerName="container-00" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.205609 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="723133c6-d80e-4dd9-94d1-a892ed270483" containerName="container-00" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.205853 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="723133c6-d80e-4dd9-94d1-a892ed270483" containerName="container-00" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.206644 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.208864 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.214208 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm"] Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.215808 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.228722 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nr6w\" (UniqueName: \"kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.228796 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.228823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.329591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nr6w\" (UniqueName: \"kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.329658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.329683 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.330656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.335315 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.346531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nr6w\" (UniqueName: \"kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w\") pod \"collect-profiles-29416155-bpxkm\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.532591 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:00 crc kubenswrapper[4885]: I1205 21:15:00.998102 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm"] Dec 05 21:15:01 crc kubenswrapper[4885]: I1205 21:15:01.088952 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" event={"ID":"60746c7b-6fc0-4731-91b9-5d0559910066","Type":"ContainerStarted","Data":"8260e85b99bd7673dfdf2f09b933f68940f289dcbe044521e30597c5e14b2d41"} Dec 05 21:15:02 crc kubenswrapper[4885]: I1205 21:15:02.103088 4885 generic.go:334] "Generic (PLEG): container finished" podID="60746c7b-6fc0-4731-91b9-5d0559910066" containerID="14ce181df835bac880a5a28c23ada87c05c7988fff3a06bcbe9c76e02c13f009" exitCode=0 Dec 05 21:15:02 crc kubenswrapper[4885]: I1205 21:15:02.103193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" event={"ID":"60746c7b-6fc0-4731-91b9-5d0559910066","Type":"ContainerDied","Data":"14ce181df835bac880a5a28c23ada87c05c7988fff3a06bcbe9c76e02c13f009"} Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.466519 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.504701 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nr6w\" (UniqueName: \"kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w\") pod \"60746c7b-6fc0-4731-91b9-5d0559910066\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.504947 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume\") pod \"60746c7b-6fc0-4731-91b9-5d0559910066\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.504972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume\") pod \"60746c7b-6fc0-4731-91b9-5d0559910066\" (UID: \"60746c7b-6fc0-4731-91b9-5d0559910066\") " Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.506753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume" (OuterVolumeSpecName: "config-volume") pod "60746c7b-6fc0-4731-91b9-5d0559910066" (UID: "60746c7b-6fc0-4731-91b9-5d0559910066"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.510643 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60746c7b-6fc0-4731-91b9-5d0559910066" (UID: "60746c7b-6fc0-4731-91b9-5d0559910066"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.511499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w" (OuterVolumeSpecName: "kube-api-access-6nr6w") pod "60746c7b-6fc0-4731-91b9-5d0559910066" (UID: "60746c7b-6fc0-4731-91b9-5d0559910066"). InnerVolumeSpecName "kube-api-access-6nr6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.606728 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60746c7b-6fc0-4731-91b9-5d0559910066-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.606768 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60746c7b-6fc0-4731-91b9-5d0559910066-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4885]: I1205 21:15:03.606780 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nr6w\" (UniqueName: \"kubernetes.io/projected/60746c7b-6fc0-4731-91b9-5d0559910066-kube-api-access-6nr6w\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:04 crc kubenswrapper[4885]: I1205 21:15:04.120436 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" event={"ID":"60746c7b-6fc0-4731-91b9-5d0559910066","Type":"ContainerDied","Data":"8260e85b99bd7673dfdf2f09b933f68940f289dcbe044521e30597c5e14b2d41"} Dec 05 21:15:04 crc kubenswrapper[4885]: I1205 21:15:04.120721 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8260e85b99bd7673dfdf2f09b933f68940f289dcbe044521e30597c5e14b2d41" Dec 05 21:15:04 crc kubenswrapper[4885]: I1205 21:15:04.120492 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-bpxkm" Dec 05 21:15:04 crc kubenswrapper[4885]: I1205 21:15:04.557947 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt"] Dec 05 21:15:04 crc kubenswrapper[4885]: I1205 21:15:04.570374 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-ww4gt"] Dec 05 21:15:05 crc kubenswrapper[4885]: I1205 21:15:05.184663 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d81118-a04d-40a2-bfbc-8cdcb5e0b301" path="/var/lib/kubelet/pods/d5d81118-a04d-40a2-bfbc-8cdcb5e0b301/volumes" Dec 05 21:15:48 crc kubenswrapper[4885]: I1205 21:15:48.547391 4885 generic.go:334] "Generic (PLEG): container finished" podID="953be525-991e-40cc-9321-2f8065e030ef" containerID="23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d" exitCode=0 Dec 05 21:15:48 crc kubenswrapper[4885]: I1205 21:15:48.547486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gc5r4/must-gather-phlff" event={"ID":"953be525-991e-40cc-9321-2f8065e030ef","Type":"ContainerDied","Data":"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d"} Dec 05 21:15:48 crc kubenswrapper[4885]: I1205 21:15:48.548657 4885 scope.go:117] "RemoveContainer" containerID="23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d" Dec 05 21:15:48 crc kubenswrapper[4885]: I1205 21:15:48.920958 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gc5r4_must-gather-phlff_953be525-991e-40cc-9321-2f8065e030ef/gather/0.log" Dec 05 21:15:56 crc kubenswrapper[4885]: I1205 21:15:56.745297 4885 scope.go:117] "RemoveContainer" containerID="0e99a558ebe9c458cbc59431f2536417d392456353f9d882bed807862382f5ec" Dec 05 21:15:58 crc kubenswrapper[4885]: I1205 21:15:58.871866 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gc5r4/must-gather-phlff"] Dec 05 21:15:58 crc kubenswrapper[4885]: I1205 21:15:58.873786 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gc5r4/must-gather-phlff" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="copy" containerID="cri-o://9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a" gracePeriod=2 Dec 05 21:15:58 crc kubenswrapper[4885]: I1205 21:15:58.890607 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gc5r4/must-gather-phlff"] Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.313132 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gc5r4_must-gather-phlff_953be525-991e-40cc-9321-2f8065e030ef/copy/0.log" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.313847 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.386693 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output\") pod \"953be525-991e-40cc-9321-2f8065e030ef\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.400985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvppm\" (UniqueName: \"kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm\") pod \"953be525-991e-40cc-9321-2f8065e030ef\" (UID: \"953be525-991e-40cc-9321-2f8065e030ef\") " Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.408137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm" (OuterVolumeSpecName: "kube-api-access-kvppm") pod "953be525-991e-40cc-9321-2f8065e030ef" (UID: "953be525-991e-40cc-9321-2f8065e030ef"). InnerVolumeSpecName "kube-api-access-kvppm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.504527 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvppm\" (UniqueName: \"kubernetes.io/projected/953be525-991e-40cc-9321-2f8065e030ef-kube-api-access-kvppm\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.555667 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "953be525-991e-40cc-9321-2f8065e030ef" (UID: "953be525-991e-40cc-9321-2f8065e030ef"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.606998 4885 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/953be525-991e-40cc-9321-2f8065e030ef-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.647556 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gc5r4_must-gather-phlff_953be525-991e-40cc-9321-2f8065e030ef/copy/0.log" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.647973 4885 generic.go:334] "Generic (PLEG): container finished" podID="953be525-991e-40cc-9321-2f8065e030ef" containerID="9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a" exitCode=143 Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.648052 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gc5r4/must-gather-phlff" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.648073 4885 scope.go:117] "RemoveContainer" containerID="9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.705431 4885 scope.go:117] "RemoveContainer" containerID="23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.878112 4885 scope.go:117] "RemoveContainer" containerID="9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a" Dec 05 21:15:59 crc kubenswrapper[4885]: E1205 21:15:59.878579 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a\": container with ID starting with 9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a not found: ID does not exist" containerID="9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.878628 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a"} err="failed to get container status \"9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a\": rpc error: code = NotFound desc = could not find container \"9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a\": container with ID starting with 9207e858c9fe8b1a77fd623613f666be446aaa9a0adb61716fdcbf72bddb978a not found: ID does not exist" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.878653 4885 scope.go:117] "RemoveContainer" containerID="23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d" Dec 05 21:15:59 crc kubenswrapper[4885]: E1205 21:15:59.878912 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d\": container with ID starting with 23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d not found: ID does not exist" containerID="23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d" Dec 05 21:15:59 crc kubenswrapper[4885]: I1205 21:15:59.878933 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d"} err="failed to get container status \"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d\": rpc error: code = NotFound desc = could not find container \"23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d\": container with ID starting with 23de8953ab59e495bd2fbef978b61c079c5a0757015152c606f4a3ce17ddb65d not found: ID does not exist" Dec 05 21:16:01 crc kubenswrapper[4885]: I1205 21:16:01.194150 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953be525-991e-40cc-9321-2f8065e030ef" path="/var/lib/kubelet/pods/953be525-991e-40cc-9321-2f8065e030ef/volumes" Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.529618 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"] Dec 05 21:16:08 crc kubenswrapper[4885]: E1205 21:16:08.530787 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60746c7b-6fc0-4731-91b9-5d0559910066" containerName="collect-profiles" Dec 05 21:16:08 crc 
kubenswrapper[4885]: I1205 21:16:08.530806 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60746c7b-6fc0-4731-91b9-5d0559910066" containerName="collect-profiles"
Dec 05 21:16:08 crc kubenswrapper[4885]: E1205 21:16:08.530864 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="gather"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.530875 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="gather"
Dec 05 21:16:08 crc kubenswrapper[4885]: E1205 21:16:08.530927 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="copy"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.530937 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="copy"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.531279 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="gather"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.531303 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="953be525-991e-40cc-9321-2f8065e030ef" containerName="copy"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.531351 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60746c7b-6fc0-4731-91b9-5d0559910066" containerName="collect-profiles"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.533361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.572136 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"]
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.686706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69fr\" (UniqueName: \"kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.686801 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.686855 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.789456 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.789985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69fr\" (UniqueName: \"kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.790133 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.790159 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.790666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.809167 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69fr\" (UniqueName: \"kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr\") pod \"redhat-operators-xbmvk\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") " pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:08 crc kubenswrapper[4885]: I1205 21:16:08.887711 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:09 crc kubenswrapper[4885]: I1205 21:16:09.359467 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"]
Dec 05 21:16:09 crc kubenswrapper[4885]: I1205 21:16:09.739509 4885 generic.go:334] "Generic (PLEG): container finished" podID="61556d6f-0ff7-4550-a652-22b796f13378" containerID="cb275480ccf853cd4f7e729fff28314aa3b8208b38c571193481b6ae40170375" exitCode=0
Dec 05 21:16:09 crc kubenswrapper[4885]: I1205 21:16:09.739564 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerDied","Data":"cb275480ccf853cd4f7e729fff28314aa3b8208b38c571193481b6ae40170375"}
Dec 05 21:16:09 crc kubenswrapper[4885]: I1205 21:16:09.739595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerStarted","Data":"e0f4947a43a8a337562a31400fc1c7e62949b0a28d942bcfded7f67d49c5a307"}
Dec 05 21:16:09 crc kubenswrapper[4885]: I1205 21:16:09.741492 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 21:16:10 crc kubenswrapper[4885]: I1205 21:16:10.749139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerStarted","Data":"31cad30e2ded9c8270f84eba82d2027dcb387ee06d6346b59ad9c336677ebe96"}
Dec 05 21:16:11 crc kubenswrapper[4885]: I1205 21:16:11.775796 4885 generic.go:334] "Generic (PLEG): container finished" podID="61556d6f-0ff7-4550-a652-22b796f13378" containerID="31cad30e2ded9c8270f84eba82d2027dcb387ee06d6346b59ad9c336677ebe96" exitCode=0
Dec 05 21:16:11 crc kubenswrapper[4885]: I1205 21:16:11.776202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerDied","Data":"31cad30e2ded9c8270f84eba82d2027dcb387ee06d6346b59ad9c336677ebe96"}
Dec 05 21:16:12 crc kubenswrapper[4885]: I1205 21:16:12.786107 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerStarted","Data":"fd1c8f22123ed5f1f4aaff462e955f58e74885f9b6bf749d28f632465ae9a77c"}
Dec 05 21:16:12 crc kubenswrapper[4885]: I1205 21:16:12.805716 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbmvk" podStartSLOduration=2.331927988 podStartE2EDuration="4.805695447s" podCreationTimestamp="2025-12-05 21:16:08 +0000 UTC" firstStartedPulling="2025-12-05 21:16:09.741194994 +0000 UTC m=+4235.038010655" lastFinishedPulling="2025-12-05 21:16:12.214962453 +0000 UTC m=+4237.511778114" observedRunningTime="2025-12-05 21:16:12.803102886 +0000 UTC m=+4238.099918547" watchObservedRunningTime="2025-12-05 21:16:12.805695447 +0000 UTC m=+4238.102511128"
Dec 05 21:16:18 crc kubenswrapper[4885]: I1205 21:16:18.888326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:18 crc kubenswrapper[4885]: I1205 21:16:18.889005 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:18 crc kubenswrapper[4885]: I1205 21:16:18.957668 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:19 crc kubenswrapper[4885]: I1205 21:16:19.907189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:22 crc kubenswrapper[4885]: I1205 21:16:22.515121 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"]
Dec 05 21:16:22 crc kubenswrapper[4885]: I1205 21:16:22.515919 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbmvk" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="registry-server" containerID="cri-o://fd1c8f22123ed5f1f4aaff462e955f58e74885f9b6bf749d28f632465ae9a77c" gracePeriod=2
Dec 05 21:16:23 crc kubenswrapper[4885]: I1205 21:16:23.908511 4885 generic.go:334] "Generic (PLEG): container finished" podID="61556d6f-0ff7-4550-a652-22b796f13378" containerID="fd1c8f22123ed5f1f4aaff462e955f58e74885f9b6bf749d28f632465ae9a77c" exitCode=0
Dec 05 21:16:23 crc kubenswrapper[4885]: I1205 21:16:23.908628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerDied","Data":"fd1c8f22123ed5f1f4aaff462e955f58e74885f9b6bf749d28f632465ae9a77c"}
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.123480 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.283657 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r69fr\" (UniqueName: \"kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr\") pod \"61556d6f-0ff7-4550-a652-22b796f13378\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") "
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.283816 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities\") pod \"61556d6f-0ff7-4550-a652-22b796f13378\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") "
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.284003 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content\") pod \"61556d6f-0ff7-4550-a652-22b796f13378\" (UID: \"61556d6f-0ff7-4550-a652-22b796f13378\") "
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.285396 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities" (OuterVolumeSpecName: "utilities") pod "61556d6f-0ff7-4550-a652-22b796f13378" (UID: "61556d6f-0ff7-4550-a652-22b796f13378"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.296287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr" (OuterVolumeSpecName: "kube-api-access-r69fr") pod "61556d6f-0ff7-4550-a652-22b796f13378" (UID: "61556d6f-0ff7-4550-a652-22b796f13378"). InnerVolumeSpecName "kube-api-access-r69fr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.387781 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r69fr\" (UniqueName: \"kubernetes.io/projected/61556d6f-0ff7-4550-a652-22b796f13378-kube-api-access-r69fr\") on node \"crc\" DevicePath \"\""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.387831 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.390552 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61556d6f-0ff7-4550-a652-22b796f13378" (UID: "61556d6f-0ff7-4550-a652-22b796f13378"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.490263 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61556d6f-0ff7-4550-a652-22b796f13378-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.926432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbmvk" event={"ID":"61556d6f-0ff7-4550-a652-22b796f13378","Type":"ContainerDied","Data":"e0f4947a43a8a337562a31400fc1c7e62949b0a28d942bcfded7f67d49c5a307"}
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.926889 4885 scope.go:117] "RemoveContainer" containerID="fd1c8f22123ed5f1f4aaff462e955f58e74885f9b6bf749d28f632465ae9a77c"
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.926551 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbmvk"
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.959326 4885 scope.go:117] "RemoveContainer" containerID="31cad30e2ded9c8270f84eba82d2027dcb387ee06d6346b59ad9c336677ebe96"
Dec 05 21:16:24 crc kubenswrapper[4885]: I1205 21:16:24.986336 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"]
Dec 05 21:16:25 crc kubenswrapper[4885]: I1205 21:16:25.003994 4885 scope.go:117] "RemoveContainer" containerID="cb275480ccf853cd4f7e729fff28314aa3b8208b38c571193481b6ae40170375"
Dec 05 21:16:25 crc kubenswrapper[4885]: I1205 21:16:25.021694 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbmvk"]
Dec 05 21:16:25 crc kubenswrapper[4885]: I1205 21:16:25.190527 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61556d6f-0ff7-4550-a652-22b796f13378" path="/var/lib/kubelet/pods/61556d6f-0ff7-4550-a652-22b796f13378/volumes"
Dec 05 21:16:46 crc kubenswrapper[4885]: I1205 21:16:46.630618 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:16:46 crc kubenswrapper[4885]: I1205 21:16:46.631168 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:17:16 crc kubenswrapper[4885]: I1205 21:17:16.630850 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:17:16 crc kubenswrapper[4885]: I1205 21:17:16.632181 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:17:46 crc kubenswrapper[4885]: I1205 21:17:46.631468 4885 patch_prober.go:28] interesting pod/machine-config-daemon-5m8lc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:17:46 crc kubenswrapper[4885]: I1205 21:17:46.632254 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:17:46 crc kubenswrapper[4885]: I1205 21:17:46.632356 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc"
Dec 05 21:17:46 crc kubenswrapper[4885]: I1205 21:17:46.633575 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"} pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:17:46 crc kubenswrapper[4885]: I1205 21:17:46.633697 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerName="machine-config-daemon" containerID="cri-o://f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45" gracePeriod=600
Dec 05 21:17:46 crc kubenswrapper[4885]: E1205 21:17:46.764151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:17:47 crc kubenswrapper[4885]: I1205 21:17:47.744594 4885 generic.go:334] "Generic (PLEG): container finished" podID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45" exitCode=0
Dec 05 21:17:47 crc kubenswrapper[4885]: I1205 21:17:47.744835 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" event={"ID":"21ee2046-c3c1-4501-abe5-0ac10ddfeaf1","Type":"ContainerDied","Data":"f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"}
Dec 05 21:17:47 crc kubenswrapper[4885]: I1205 21:17:47.745051 4885 scope.go:117] "RemoveContainer" containerID="205f14d82d152bace567ea4b3c4f4a866de21c49423e552f7b835ff0dc2520e5"
Dec 05 21:17:47 crc kubenswrapper[4885]: I1205 21:17:47.746957 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:17:47 crc kubenswrapper[4885]: E1205 21:17:47.747470 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:17:56 crc kubenswrapper[4885]: I1205 21:17:56.870646 4885 scope.go:117] "RemoveContainer" containerID="fe6dffa1955a9b36b5efe9259644573f798f4ad07ff2268d17fcc9ae17f960ca"
Dec 05 21:18:00 crc kubenswrapper[4885]: I1205 21:18:00.173596 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:18:00 crc kubenswrapper[4885]: E1205 21:18:00.174967 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:18:15 crc kubenswrapper[4885]: I1205 21:18:15.182477 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:18:15 crc kubenswrapper[4885]: E1205 21:18:15.183573 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:18:30 crc kubenswrapper[4885]: I1205 21:18:30.173220 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:18:30 crc kubenswrapper[4885]: E1205 21:18:30.174198 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:18:41 crc kubenswrapper[4885]: I1205 21:18:41.174471 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:18:41 crc kubenswrapper[4885]: E1205 21:18:41.175418 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:18:52 crc kubenswrapper[4885]: I1205 21:18:52.173864 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:18:52 crc kubenswrapper[4885]: E1205 21:18:52.175141 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:19:05 crc kubenswrapper[4885]: I1205 21:19:05.547297 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:19:05 crc kubenswrapper[4885]: E1205 21:19:05.547915 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.750475 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:09 crc kubenswrapper[4885]: E1205 21:19:09.751880 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="extract-utilities"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.751893 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="extract-utilities"
Dec 05 21:19:09 crc kubenswrapper[4885]: E1205 21:19:09.751905 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="extract-content"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.751913 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="extract-content"
Dec 05 21:19:09 crc kubenswrapper[4885]: E1205 21:19:09.751923 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="registry-server"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.751929 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="registry-server"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.752139 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="61556d6f-0ff7-4550-a652-22b796f13378" containerName="registry-server"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.753521 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.788108 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.806321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.806438 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2jv\" (UniqueName: \"kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.806462 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.908302 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.908417 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2jv\" (UniqueName: \"kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.908503 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.908991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.908995 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:09 crc kubenswrapper[4885]: I1205 21:19:09.930278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2jv\" (UniqueName: \"kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv\") pod \"community-operators-gmxmx\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") " pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:10 crc kubenswrapper[4885]: I1205 21:19:10.090052 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:10 crc kubenswrapper[4885]: I1205 21:19:10.702769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:10 crc kubenswrapper[4885]: W1205 21:19:10.707046 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda166417a_6e06_46a6_ad7d_99d18d0f04d0.slice/crio-953dbc16ac1cf06bac7607a0f53356ca9a314747d16f3f28242bf76ceed8915c WatchSource:0}: Error finding container 953dbc16ac1cf06bac7607a0f53356ca9a314747d16f3f28242bf76ceed8915c: Status 404 returned error can't find the container with id 953dbc16ac1cf06bac7607a0f53356ca9a314747d16f3f28242bf76ceed8915c
Dec 05 21:19:11 crc kubenswrapper[4885]: I1205 21:19:11.645280 4885 generic.go:334] "Generic (PLEG): container finished" podID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerID="562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d" exitCode=0
Dec 05 21:19:11 crc kubenswrapper[4885]: I1205 21:19:11.645757 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerDied","Data":"562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d"}
Dec 05 21:19:11 crc kubenswrapper[4885]: I1205 21:19:11.646131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerStarted","Data":"953dbc16ac1cf06bac7607a0f53356ca9a314747d16f3f28242bf76ceed8915c"}
Dec 05 21:19:12 crc kubenswrapper[4885]: I1205 21:19:12.656365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerStarted","Data":"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"}
Dec 05 21:19:13 crc kubenswrapper[4885]: I1205 21:19:13.667703 4885 generic.go:334] "Generic (PLEG): container finished" podID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerID="0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7" exitCode=0
Dec 05 21:19:13 crc kubenswrapper[4885]: I1205 21:19:13.667748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerDied","Data":"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"}
Dec 05 21:19:14 crc kubenswrapper[4885]: I1205 21:19:14.677911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerStarted","Data":"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"}
Dec 05 21:19:14 crc kubenswrapper[4885]: I1205 21:19:14.702513 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gmxmx" podStartSLOduration=3.251483726 podStartE2EDuration="5.702487545s" podCreationTimestamp="2025-12-05 21:19:09 +0000 UTC" firstStartedPulling="2025-12-05 21:19:11.647456357 +0000 UTC m=+4416.944272018" lastFinishedPulling="2025-12-05 21:19:14.098460166 +0000 UTC m=+4419.395275837" observedRunningTime="2025-12-05 21:19:14.695515117 +0000 UTC m=+4419.992330808" watchObservedRunningTime="2025-12-05 21:19:14.702487545 +0000 UTC m=+4419.999303246"
Dec 05 21:19:19 crc kubenswrapper[4885]: I1205 21:19:19.173679 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:19:19 crc kubenswrapper[4885]: E1205 21:19:19.175904 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:19:20 crc kubenswrapper[4885]: I1205 21:19:20.092068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:20 crc kubenswrapper[4885]: I1205 21:19:20.092546 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:20 crc kubenswrapper[4885]: I1205 21:19:20.149630 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:20 crc kubenswrapper[4885]: I1205 21:19:20.802935 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:20 crc kubenswrapper[4885]: I1205 21:19:20.858164 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:22 crc kubenswrapper[4885]: I1205 21:19:22.769809 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gmxmx" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="registry-server" containerID="cri-o://3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825" gracePeriod=2
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.757944 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.782788 4885 generic.go:334] "Generic (PLEG): container finished" podID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerID="3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825" exitCode=0
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.782828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerDied","Data":"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"}
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.782834 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmxmx"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.782855 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmxmx" event={"ID":"a166417a-6e06-46a6-ad7d-99d18d0f04d0","Type":"ContainerDied","Data":"953dbc16ac1cf06bac7607a0f53356ca9a314747d16f3f28242bf76ceed8915c"}
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.782875 4885 scope.go:117] "RemoveContainer" containerID="3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.819465 4885 scope.go:117] "RemoveContainer" containerID="0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.849361 4885 scope.go:117] "RemoveContainer" containerID="562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.888917 4885 scope.go:117] "RemoveContainer" containerID="3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"
Dec 05 21:19:23 crc kubenswrapper[4885]: E1205 21:19:23.889635 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825\": container with ID starting with 3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825 not found: ID does not exist" containerID="3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.889670 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825"} err="failed to get container status \"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825\": rpc error: code = NotFound desc = could not find container \"3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825\": container with ID starting with 3978e728f951170af56a929106b863cc41e009620894a42176808d0398796825 not found: ID does not exist"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.889689 4885 scope.go:117] "RemoveContainer" containerID="0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"
Dec 05 21:19:23 crc kubenswrapper[4885]: E1205 21:19:23.890123 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7\": container with ID starting with 0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7 not found: ID does not exist" containerID="0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.890199 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7"} err="failed to get container status \"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7\": rpc error: code = NotFound desc = could not find container \"0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7\": container with ID starting with 0c52f7d9f0d90d468c4bada184b6c712b10a1d9afed5a7673599aa6b47ac2ff7 not found: ID does not exist"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.890231 4885 scope.go:117] "RemoveContainer" containerID="562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d"
Dec 05 21:19:23 crc kubenswrapper[4885]: E1205 21:19:23.890610 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d\": container with ID starting with 562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d not found: ID does not exist" containerID="562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.890634 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d"} err="failed to get container status \"562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d\": rpc error: code = NotFound desc = could not find container \"562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d\": container with ID starting with 562023a648a70149fc58afe628f094f92f683c613ab725b9e4c1f564ece2ec4d not found: ID does not exist"
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.893325 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2jv\" (UniqueName: \"kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv\") pod \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") "
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.893555 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content\") pod \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") "
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.893713 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities\") pod \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\" (UID: \"a166417a-6e06-46a6-ad7d-99d18d0f04d0\") "
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.894315 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities" (OuterVolumeSpecName: "utilities") pod "a166417a-6e06-46a6-ad7d-99d18d0f04d0" (UID: "a166417a-6e06-46a6-ad7d-99d18d0f04d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.899950 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv" (OuterVolumeSpecName: "kube-api-access-xr2jv") pod "a166417a-6e06-46a6-ad7d-99d18d0f04d0" (UID: "a166417a-6e06-46a6-ad7d-99d18d0f04d0"). InnerVolumeSpecName "kube-api-access-xr2jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.945489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a166417a-6e06-46a6-ad7d-99d18d0f04d0" (UID: "a166417a-6e06-46a6-ad7d-99d18d0f04d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.996338 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.996369 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a166417a-6e06-46a6-ad7d-99d18d0f04d0-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:23 crc kubenswrapper[4885]: I1205 21:19:23.996383 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2jv\" (UniqueName: \"kubernetes.io/projected/a166417a-6e06-46a6-ad7d-99d18d0f04d0-kube-api-access-xr2jv\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:24 crc kubenswrapper[4885]: I1205 21:19:24.140899 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:24 crc kubenswrapper[4885]: I1205 21:19:24.151609 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gmxmx"]
Dec 05 21:19:25 crc kubenswrapper[4885]: I1205 21:19:25.188303 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" path="/var/lib/kubelet/pods/a166417a-6e06-46a6-ad7d-99d18d0f04d0/volumes"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.174958 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:19:32 crc kubenswrapper[4885]: E1205 21:19:32.175752 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.178692 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:32 crc kubenswrapper[4885]: E1205 21:19:32.179074 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="extract-utilities"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.179095 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="extract-utilities"
Dec 05 21:19:32 crc kubenswrapper[4885]: E1205 21:19:32.179122 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="registry-server"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.179129 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="registry-server"
Dec 05 21:19:32 crc kubenswrapper[4885]: E1205 21:19:32.179156 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="extract-content"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.179162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="extract-content"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.179356 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a166417a-6e06-46a6-ad7d-99d18d0f04d0" containerName="registry-server"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.181223 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.193254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.368188 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbc7f\" (UniqueName: \"kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.368420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.368520 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.469886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.470231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.470451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbc7f\" (UniqueName: \"kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.470623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.470922 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.493124 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbc7f\" (UniqueName: \"kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f\") pod \"certified-operators-k7lm5\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") " pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:32 crc kubenswrapper[4885]: I1205 21:19:32.505918 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:33 crc kubenswrapper[4885]: I1205 21:19:33.113737 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:33 crc kubenswrapper[4885]: I1205 21:19:33.900657 4885 generic.go:334] "Generic (PLEG): container finished" podID="2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" containerID="d9f12af153de1085c4191dc7fcad04e1267c783135479a3a5e94e6743c889759" exitCode=0
Dec 05 21:19:33 crc kubenswrapper[4885]: I1205 21:19:33.900743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerDied","Data":"d9f12af153de1085c4191dc7fcad04e1267c783135479a3a5e94e6743c889759"}
Dec 05 21:19:33 crc kubenswrapper[4885]: I1205 21:19:33.900940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerStarted","Data":"55fab7031dadbdf4f4bb2c20a23e869330af6965ea0fae8b6fa4f1aae26d5c66"}
Dec 05 21:19:34 crc kubenswrapper[4885]: I1205 21:19:34.910310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerStarted","Data":"62ce39aeb0efb94fa677b57260cb6568c082b641c6b027f27ab7c1e6517926c2"}
Dec 05 21:19:35 crc kubenswrapper[4885]: I1205 21:19:35.920898 4885 generic.go:334] "Generic (PLEG): container finished" podID="2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" containerID="62ce39aeb0efb94fa677b57260cb6568c082b641c6b027f27ab7c1e6517926c2" exitCode=0
Dec 05 21:19:35 crc kubenswrapper[4885]: I1205 21:19:35.921122 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerDied","Data":"62ce39aeb0efb94fa677b57260cb6568c082b641c6b027f27ab7c1e6517926c2"}
Dec 05 21:19:36 crc kubenswrapper[4885]: I1205 21:19:36.932625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerStarted","Data":"291de1ccc1fe29cc6468cc144fa9eaf603590b7738f0f4e6c7462788b6d07159"}
Dec 05 21:19:36 crc kubenswrapper[4885]: I1205 21:19:36.955278 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k7lm5" podStartSLOduration=2.53526985 podStartE2EDuration="4.9552595s" podCreationTimestamp="2025-12-05 21:19:32 +0000 UTC" firstStartedPulling="2025-12-05 21:19:33.903647969 +0000 UTC m=+4439.200463630" lastFinishedPulling="2025-12-05 21:19:36.323637619 +0000 UTC m=+4441.620453280" observedRunningTime="2025-12-05 21:19:36.9498192 +0000 UTC m=+4442.246634871" watchObservedRunningTime="2025-12-05 21:19:36.9552595 +0000 UTC m=+4442.252075151"
Dec 05 21:19:42 crc kubenswrapper[4885]: I1205 21:19:42.507056 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:42 crc kubenswrapper[4885]: I1205 21:19:42.507671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:42 crc kubenswrapper[4885]: I1205 21:19:42.568320 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:43 crc kubenswrapper[4885]: I1205 21:19:43.083166 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:43 crc kubenswrapper[4885]: I1205 21:19:43.129533 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:44 crc kubenswrapper[4885]: I1205 21:19:44.173345 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:19:44 crc kubenswrapper[4885]: E1205 21:19:44.173548 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:19:45 crc kubenswrapper[4885]: I1205 21:19:45.021417 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k7lm5" podUID="2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" containerName="registry-server" containerID="cri-o://291de1ccc1fe29cc6468cc144fa9eaf603590b7738f0f4e6c7462788b6d07159" gracePeriod=2
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.037945 4885 generic.go:334] "Generic (PLEG): container finished" podID="2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" containerID="291de1ccc1fe29cc6468cc144fa9eaf603590b7738f0f4e6c7462788b6d07159" exitCode=0
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.038087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerDied","Data":"291de1ccc1fe29cc6468cc144fa9eaf603590b7738f0f4e6c7462788b6d07159"}
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.352994 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.570983 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities" (OuterVolumeSpecName: "utilities") pod "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" (UID: "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.569989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities\") pod \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") "
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.571268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content\") pod \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") "
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.571397 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbc7f\" (UniqueName: \"kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f\") pod \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\" (UID: \"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc\") "
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.572465 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.582632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f" (OuterVolumeSpecName: "kube-api-access-hbc7f") pod "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" (UID: "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc"). InnerVolumeSpecName "kube-api-access-hbc7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.619531 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" (UID: "2c43ef60-e7cc-4f06-8806-bff4aa0da2dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.673463 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbc7f\" (UniqueName: \"kubernetes.io/projected/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-kube-api-access-hbc7f\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:46 crc kubenswrapper[4885]: I1205 21:19:46.673500 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.075269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k7lm5" event={"ID":"2c43ef60-e7cc-4f06-8806-bff4aa0da2dc","Type":"ContainerDied","Data":"55fab7031dadbdf4f4bb2c20a23e869330af6965ea0fae8b6fa4f1aae26d5c66"}
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.075584 4885 scope.go:117] "RemoveContainer" containerID="291de1ccc1fe29cc6468cc144fa9eaf603590b7738f0f4e6c7462788b6d07159"
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.075375 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k7lm5"
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.124211 4885 scope.go:117] "RemoveContainer" containerID="62ce39aeb0efb94fa677b57260cb6568c082b641c6b027f27ab7c1e6517926c2"
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.136309 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.142415 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k7lm5"]
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.202278 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c43ef60-e7cc-4f06-8806-bff4aa0da2dc" path="/var/lib/kubelet/pods/2c43ef60-e7cc-4f06-8806-bff4aa0da2dc/volumes"
Dec 05 21:19:47 crc kubenswrapper[4885]: I1205 21:19:47.721343 4885 scope.go:117] "RemoveContainer" containerID="d9f12af153de1085c4191dc7fcad04e1267c783135479a3a5e94e6743c889759"
Dec 05 21:19:56 crc kubenswrapper[4885]: I1205 21:19:56.172938 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:19:56 crc kubenswrapper[4885]: E1205 21:19:56.173737 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:20:08 crc kubenswrapper[4885]: I1205 21:20:08.173335 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:20:08 crc kubenswrapper[4885]: E1205 21:20:08.174131 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:20:22 crc kubenswrapper[4885]: I1205 21:20:22.174594 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:20:22 crc kubenswrapper[4885]: E1205 21:20:22.175843 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"
Dec 05 21:20:33 crc kubenswrapper[4885]: I1205 21:20:33.173367 4885 scope.go:117] "RemoveContainer" containerID="f9899387cbbfef33a6d8e3ea3b88aebae9b0e87cfeabdbf7f9d9086fd61f7b45"
Dec 05 21:20:33 crc kubenswrapper[4885]: E1205 21:20:33.174331 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5m8lc_openshift-machine-config-operator(21ee2046-c3c1-4501-abe5-0ac10ddfeaf1)\"" pod="openshift-machine-config-operator/machine-config-daemon-5m8lc" podUID="21ee2046-c3c1-4501-abe5-0ac10ddfeaf1"